We're very early in the private cloud business -- which is precisely why such large and influential vendors as Oracle, Intel, HP, VMware, Citrix and Red Hat are jumping into the market with initiatives and pledges for standards and support. We're seeing some whoppers here at Oracle OpenWorld, from Oracle, Intel and HP in particular.
Why? The early birds that can establish de facto standards on data portability and resource governance -- minding the boundaries between the private and public clouds and their digital condensates -- will be in a position to define the next abstraction of meta operating system (for lack of a better term).
In just the last two weeks, VMware, Citrix and now Oracle have pledged to come to market with the infrastructure that enterprises and service providers alike will want. The cloud wanters need cloud makers, the picks and shovels, to build out the vision of next-generation data center fabrics -- of dynamic resource pools of infrastructure, platform, data applications and management services.
How these services are supported, and how they are managed to inter-relate with each other and the services-abstracted older IT assets, forms the new uber platform -- the new target through which to attract developers, architects, partners and users -- lots and lots of users all feeding off of huge clouds of dynamic, low-cost services.
Yes, a market critical-mass cloud platform standard implementation could create yet a new way to lock in huge multi-billion-dollar markets to ... need. To need, and to want, and to buy, and to have a heck of a hard time stopping that needing. The picks and shovels. The lock-in, the black hole-pull of the infrastructure, hard to resist, and then ... impossible.
Such a prize! And just like in the past, the crass business interests side of the vendors will want to own, dominate and lock-in to their proprietary platform implementations. Opposing forces, also inside the same vendors, will opine on the need (correctly) for openness and standards to provide the real value the users and ecology players demand. The new lock-in, they will say (correctly) is not technical but in terms of convenience, simplicity, power, and cost. Seduce them, don't force them, might be the mantra.
So seduce or lock-in, early-days private cloud platform definitions require the best management of two sets of boundaries -- the first falls between the public-facing clouds and the nascent "private," on-premises, or enterprise clouds. The pay-off comes not just from operating efficiencies but from how well the services generated from either type of cloud can interoperate and play well in supporting extended enterprise and B2C processes.
This need to cross boundaries well will also prompt the handful of public cloud providers (Amazon, Google, Yahoo, Microsoft, Apple, etc.) to embrace sufficient levels of standards-based interoperability. Think of it as mass markets balancing interests ... like globalization ... where economics more than proprietary technologies wins the day.
The second boundary to be defined properly is between the legacy systems, SOAs, business applications and middleware -- and the private cloud fabrics that will increasingly be where new applications and services are "natively" deployed, and where the integrations to the old stuff occur. We can't really have two kinds of clouds -- one for IT and one for consumers. There needs to be one cloud that suits all of the digital universe, within certain (as yet undefined) parameters. Vendors really need to get this boundary right so that B2E and B2B are also B2C.
Clouds will, of course, be highly virtualized, and so they will be able to support many of the older proprietary and standards-based IT systems and development environments. But why virtualize the new stuff, too? Why have B2E/B2B old and separately B2B/B2C new? We should want one cloud approach that newer apps and services can target directly, and then virtualize all the older stuff.
The question then is what constitutes the new "native" platform that is of, for, and by the standard cloud. If there is a fairly well-defined, standards-based approach to cloud computing that manages all these boundaries -- between public and private, between the old and the new of IT -- and which can serve as the target for all the new apps, services, data abstractions, modeling tools, workflow/policy/governance/ESBs and development needs -- well that's a business worth shooting for.
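To make the portability idea concrete, here is a purely illustrative Python sketch -- every class and function name below is invented for this example, not any vendor's actual API -- of what targeting one "native" cloud interface might look like, with public and private providers plugged in behind it:

```python
from abc import ABC, abstractmethod

class BlobStore(ABC):
    """A hypothetical 'native' cloud storage interface that applications
    target directly, regardless of which cloud sits behind it."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...

class PrivateCloudStore(BlobStore):
    """Stands in for an on-premises cloud; here just an in-memory dict."""
    def __init__(self):
        self._data = {}
    def put(self, key, data):
        self._data[key] = data
    def get(self, key):
        return self._data[key]

class PublicCloudStore(BlobStore):
    """Stands in for a public provider; identical contract, different backend."""
    def __init__(self):
        self._data = {}
    def put(self, key, data):
        self._data[key] = data
    def get(self, key):
        return self._data[key]

def archive_report(store: BlobStore, name: str, body: bytes) -> None:
    # The application code is identical whichever cloud is plugged in --
    # that portability is what the boundary standards would have to guarantee.
    store.put(f"reports/{name}", body)

for store in (PrivateCloudStore(), PublicCloudStore()):
    archive_report(store, "q3.txt", b"quarterly numbers")
    assert store.get("reports/q3.txt") == b"quarterly numbers"
```

The point of the sketch is the shape of the contract, not the storage: if both public and private clouds honor the same interface, apps and data can cross the boundary without rework.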
However the lock-in occurs, this is the business of the next $100 billion company. In other words, getting this right is a very big deal. The time is nigh for defining IT for at least a decade, maybe longer.
But like the Unix wars of old (and the app server wars of not-so-old) there will be jockeying for cloud implementation supremacy, brinkmanship over whose this or that is better, and a high-stakes race over who gets the boundary definitions right for users, developers, the channel, and partners. Who can woo the best?
What is different this time, in cloud time, is that there are few players that can play this game, less of a channel to be concerned about, and fewer developer communities to woo. Far more than in the past, developers can use the tools and frameworks of their choice, and the clouds will support them. Users also have new choices -- not between a Mac and a PC, between Unix and x86, between Java and .Net, between Linux and Windows -- but between cloud ecologies of vast services providers. The better the bundle of services (and therefore interop and cooperation), the better the customer attraction and loyalty. The seduction, the lock-in, comes from packaging and execution on the services delivery.
More important than in past vendor sporting events, the business model rules. The cloud model that wins is the "preferred cloud model" that gives IT shops in enterprises high performance at manageable complexity and dramatically lower total costs. That same "preferred" cloud attracts the platform as a service developer crowd, allows mashups galore, allows for pay-as-you-use subscription fees. Viral adoption on a global scale. Oh, and the winning cloud also best plays out the subsidy from online advertising in all its forms and permutations.
Yes, we can expect several fruitful years of jockeying among the major vendors, the rain makers for the cloud providers -- and see gathering clouds of alliances among some, and against others. We're only seeing the very beginning of the next chapter of IT in the last few weeks of IT vendor news.
The cloud wars, however, won't be won on technical merits alone; they will be a real beauty pageant, too. It will be more of a seduction and an election, less of a sleight of hand and leveraging of incumbency ... and that will be a big switch.
Wednesday, September 24, 2008
From OpenWorld, Oracle and HP align forces to modernize legacy apps and spur IT transformation
Listen to the podcast. Download the podcast. Find it on iTunes/iPod. Learn more. Sponsor: Hewlett-Packard.
Read a full transcript of the discussion.
The avenues to IT transformation are many, but the end result must include modernization of data, applications, systems, and operational best practices. It's no surprise then that the partnership of Oracle and Hewlett-Packard gained new ground Sept. 24 at Oracle OpenWorld in San Francisco.
The companies are providing products and services that holistically support the many variables required to successfully implement IT transformation. HP hardware and storage systems have been tuned to support Oracle databases, applications and software infrastructure for many years, and the partnership continues to expand in the age of SOA, legacy modernization, and cloud computing.
To learn more about how HP and Oracle will continue to act in concert, especially as enterprises seek the highest data center performance at the lowest cost, BriefingsDirect interviewed Paul Evans, worldwide marketing lead for IT transformation solutions at HP, and Lance Knowlton, vice president for modernization at Oracle. The discussion took place Sept. 23, 2008 at the Oracle OpenWorld conference.
The application modernization and IT transformation interview, moderated by yours truly from San Francisco, comes as part of a series of discussions with IT executives I’ll be doing this week from the Oracle OpenWorld conference. See the full list of podcasts and interviews.
Read a full transcript of the discussion.
Listen to the podcast. Download the podcast. Find it on iTunes/iPod. Learn more. Sponsor: Hewlett-Packard.
Tuesday, September 23, 2008
Amid financial sector turmoil, combined HP-EDS solutions uniquely span public-private divide
Listen to the podcast. Download the podcast. Find it on iTunes/iPod. Learn more. Sponsor: Hewlett-Packard.
Read a full transcript of the conversation.
As we witness unprecedented turmoil throughout the world's financial trading centers, the question in IT circles is: How will this impact the providers of systems, software and services? Not all vendors will fare the same, and those that possess the solutions -- and have the track record and experienced personnel in place -- will be more likely to become part of the new high finance landscape, and the new public-private solutions.
The timing of Wall Street facing some of its darker days comes as HP and the newly acquired EDS unit are combining forces in unique ways. Between them, EDS and HP have been servicing the financial and government sectors for decades. Combined, HP and EDS are uniquely positioned to assist potentially massive transitions and unprecedented public-private interactions.
To learn more about how HP and EDS will newly align, especially amid financial sector turmoil, BriefingsDirect interviewed Maria Allen, vice president of Global Financial Services Industry solutions at EDS. The discussion took place Sept. 22, 2008 at the Oracle OpenWorld conference.
The Allen interview, moderated by yours truly from San Francisco, comes as part of a series of discussions with IT executives I’ll be doing this week from the conference. See the full list of podcasts and interviews.
Read a full transcript of the conversation.
Listen to the podcast. Download the podcast. Find it on iTunes/iPod. Learn more. Sponsor: Hewlett-Packard.
Oracle's Beehive push portends a rethinking of the economics and methods of enterprise messaging
You have to give Oracle credit for persistence. The software giant has been trying to build out its groupware business for nearly 10 years, so far with only modest success.
Now, with Beehive, the next generation of its collaboration suite, Oracle may be sniffing some fresh and meaningful blood in the enterprise messaging waters.
The investment Oracle is making in Beehive, announced this week at the massive Oracle OpenWorld conference in San Francisco, signals an opportunity born more by the shifting sands beneath Microsoft Exchange and Outlook, than in any new-found performance breakthroughs from Oracle's developers.
Here's why: Economics and technology improvements, particularly around virtualization, are bringing more IT functionality generally back to the servers and off of the client PCs. As a result, the client-server relationship between Microsoft Exchange Server and the Outlook client -- and all those massive, costly, and risky .pst files on each PC -- is being broken.
The new relationship is server to browser, or server to thin-client ICA-fed receiver. Here's what the CIO of Bechtel told a group of analysts recently: "Spend your [IT] money on the back end, not on the front end."
The cost, security risks, and lack of extensibility of the data inside of Exchange, and on all those end-device hard drives, make for an unsustainable IT millstone. Messaging times, they are a-changin'. Sure, some will just keep Exchange and deliver the client as Outlook Web Access, or via terminal services.
But what I hear from those CIOs now leveraging virtualization and evaluating VDI is that the Exchange-Outlook-SharePoint trifecta is near the top of their list of first strikes to slash costs and move this messaging beast onto the server resources pool, where it can be wrestled to the ground and re-architected in an SOA. They have similar thoughts about client-side spreadsheets like Excel, too, but that's another blog.
Yep, Exchange and its coterie are widely acknowledged as coming with an agility deficit and at a premium TCO -- but with commodity-priced features and functions. For all intents and purposes, email, calendar, file folders, and even unified messaging functions are free, or at least low-cost features of larger application function sets or suites.
Enterprises are paying gold for copper, when it comes to messaging and groupware. And then they have to integrate it.
Oracle recognizes that as enterprises move from high-cost, low-flexibility client-server Exchange to services-based, server-hosted messaging -- increasingly extending messaging services in the context of SOA, network services like Cisco's SONA, web services, and cloud services -- they will be looking beyond Exchange.
Enterprises over the next several years will be rethinking messaging from a paradigm, cost, and feature-set perspective. A big, honking, expensive client-server approach will give way to something cheaper, more flexible, better able to integrate, and more likely to play well in an on-premises cloud, where the data files are not messaging-system specific. Exchange is a Model T in a Thunderbird world.
Oracle, IBM, Google, Yahoo ... they all have their sights set on poaching and chipping away at the massive and vulnerable global Exchange franchise (just as Microsoft did to Lotus Notes and GroupWise). And that pulls out yet another tumbler from Microsoft's enterprise lock-in.
It won't happen overnight, but it will happen. Oracle is betting on it.
Sybase moves to spur process modeling agility with latest PowerDesigner
Sybase today announced a new version of its PowerDesigner tools, a model-driven approach to crafting and implementing business processes.
PowerDesigner 15 provides modeling and metadata management through a Link and Synch technology, helping to increase impact analysis and providing greater visibility for business analysts.
The main goal, according to Sybase, is to create greater agility by breaking down the silos that currently wall off the various IT elements from each other and from the business goals. See my thoughts on how CEP is stepping up to the plate on similar values. And we've seen a lot of action on improving business process modeling lately.
Key features of PowerDesigner 15 include:
- The Link and Synch technology, which captures the intersections between all architectural layers and perspectives of the enterprise.
- An architecture model that allows users to formally capture all metadata relevant to traditional enterprise architecture (EA) analysis.
- An impact analysis diagram that allows visualization of the cascading impact of change and the management of time and costs associated with changes.
- Customizable support for homegrown or industry standards.
- A repository Web viewer that allows sharing EA metadata with all stakeholders.

PowerDesigner 15 is currently scheduled to be available on Oct. 31 and ranges in price from $7,495 to $11,495 per developer seat. More information is available at the PowerDesigner Web site.
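The impact-analysis idea -- tracing the cascading effect of a change through captured metadata -- can be sketched in a few lines of Python. This is a toy illustration with invented names, not PowerDesigner's implementation:

```python
def impact(graph, changed):
    """Toy impact analysis: given dependencies (artifact -> things built on it),
    return everything transitively affected by a change."""
    affected, stack = set(), [changed]
    while stack:
        node = stack.pop()
        for dependent in graph.get(node, []):
            if dependent not in affected:
                affected.add(dependent)
                stack.append(dependent)  # follow the cascade further out
    return affected

# Hypothetical metadata: a table feeds a data model, which feeds two processes.
deps = {
    "customer_table": ["customer_model"],
    "customer_model": ["billing_process", "crm_report"],
}
print(sorted(impact(deps, "customer_table")))
# -> ['billing_process', 'crm_report', 'customer_model']
```

A real EA repository holds far richer metadata, but the underlying operation -- a transitive walk over linked artifacts -- is what lets an analyst see every downstream consequence of a proposed change before making it.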
Monday, September 22, 2008
Complex Event Processing goes mainstream with a boost from TIBCO's latest solution
We often hear a lot about how IT helps business with their "outcomes," and then we're shown a flow chart diagram with a lot of arrows and boxes ... that ultimately points to business "agility" in flashing lights.
Sometimes the dots connect, and sometimes there's a required leap of faith that IT spending X will translate into business benefits Y.
But a new box on the flow chart these days, Complex Event Processing (CEP), really does close the loop between what IT does and what businesses want to do. CEP actually builds on what business intelligence (BI), service-oriented architecture (SOA), cloud computing, business process modeling (BPM), and a few other assorted acronyms provide.
CEP is a great way for all the myriad old and new investments in IT to be more fully leveraged to accommodate the business needs of automating processes, managing complexity, reducing risk, and capturing excellence for repeated use.
Based on its proven heritage in financial services, CEP has a lot of value to offer many other kinds of companies as they seek to extract "business outcomes" from the IT departments' raft of services. That's why I think CEP's value should be directed at CEOs, line of business managers, COOs, CSOs, and CMOs -- not just the database administrators and other mandarins of IT.
That's because modern IT has elevated many aspects of data resources into services that support "events." So the place to mine for patterns of efficiency or waste -- to uncover excellence or risk -- is in the interactions of those complex events. And once you've done that, not only can you capture those good and bad events, you can execute on them to reduce the risks or to capture that excellence and instantiate it as repeatable processes.
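As a toy illustration of that pattern-mining idea -- not any vendor's actual engine, and with all names invented -- here is a minimal Python sketch of a complex-event rule that watches a sliding window of events and acts when a pattern appears:

```python
from collections import deque

def detect_bursts(events, window=5, threshold=3):
    """Toy complex-event rule: flag any point where `threshold` or more
    'error' events fall within a sliding window of `window` events."""
    recent = deque(maxlen=window)
    alerts = []
    for i, event in enumerate(events):
        recent.append(event)
        if sum(1 for e in recent if e == "error") >= threshold:
            # Acting on the detected pattern -- here, recording an alert --
            # is what closes the loop between observation and response.
            alerts.append(i)
            recent.clear()  # reset so one burst yields one alert
    return alerts

stream = ["ok", "error", "ok", "error", "error", "ok", "ok"]
print(detect_bursts(stream))  # -> [4]
```

Production CEP engines evaluate thousands of such rules over high-volume streams in near real time, but the essence is the same: a declarative pattern over a window of events, tied to an action.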
And it's in this ability to execute within the domain of CEP that TIBCO Software has today introduced TIBCO BusinessEvents 3.0. The latest version of this CEP harness solution builds on the esoteric CEP capabilities that program traders have used and makes them more mainstream, said TIBCO. [Disclosure: TIBCO is a sponsor of BriefingsDirect podcasts.]
Making CEP mainstream through BusinessEvents 3.0 has required some enhancements, including:
- Decision Manager, a new business-user interface that helps business users write rules and queries that tap into the power of CEP in their domain of expertise.
- Events Stream Processing, a BusinessEvents query language that allows SQL-like queries to target event streams in real time, and allows immediate action to be taken on patterns of interest.
- Distributed BusinessEvents, a distributed cache and rules engine that provides massive scaling of event monitoring, as much as twice the magnitude of event monitoring previously possible.

I think that CEP offers the ability to extract real and appreciated business value from a long history of IT improvements. If companies like BI, and they do, then CEP takes off where BI leaves off, and the combination of strong capabilities in BI and CEP is exactly what enterprises need now to provide innovation and efficiency in complex and distributed undertakings.

And TIBCO's products are pointing the way to take the insights of CEP into the realm of near real-time responses and the ability to identify and repeat effective patterns of business behaviors. Dare I say, "agility"?
Saturday, September 20, 2008
LogLogic updates search and analysis tools for conquering IT systems management complexity
Insight into operations has been a hallmark of modern business improvements, from integrated back-office applications to business intelligence (BI) to balanced scorecards and management portals.
But how does the IT executive gain similar insight into the systems operations that support the business operations? Well, they have reams of disparate logs and systems analytics data that pour forth every second from all their network and infrastructure devices. Making sense of the data and leveraging the analytics to reduce the risk of failure therefore becomes the equivalent of BI for IT.
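As a hedged, minimal illustration of that "BI for IT" idea -- not LogLogic's actual product, and with the log format and names invented for this sketch -- consider a Python snippet that aggregates raw device logs into a ranked view of error hotspots:

```python
import re
from collections import Counter

# Assumed toy log format: "<host> <severity> <message>"
LINE = re.compile(r"^(?P<host>\S+) (?P<severity>INFO|WARN|ERROR) (?P<msg>.*)$")

def error_hotspots(log_lines):
    """Toy 'BI for IT': roll raw device logs up into a ranked view of
    which sources are producing the most errors."""
    errors = Counter()
    for line in log_lines:
        m = LINE.match(line)
        if m and m.group("severity") == "ERROR":
            errors[m.group("host")] += 1
    return errors.most_common()

logs = [
    "fw01 ERROR dropped packet flood",
    "db02 INFO checkpoint complete",
    "fw01 ERROR dropped packet flood",
    "app03 WARN slow response",
]
print(error_hotspots(logs))  # -> [('fw01', 2)]
```

Commercial log-management appliances do this at vastly larger scale, across heterogeneous formats, with retention and compliance controls layered on top -- but the core move is the same aggregation from raw lines to actionable rankings.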
Now a major BI-for-IT provider, LogLogic, has beefed up its flagship products with the announcement of LogLogic 4.6. Putting more data together in ways that can be quickly acted on helps companies gain critical visibility into their increasingly complex IT operations, while easing regulatory compliance and improving security. [Disclosure: LogLogic is a sponsor of BriefingsDirect podcasts.]
The latest version of the log management tools from San Jose, Calif.-based LogLogic includes new features that help give enterprises a 360-degree view of how business operations are running, including dynamic range selection, graphical trending, and real-time reporting. Among the improvements are:
I have talked extensively to the folks at LogLogic about the log-centered approach to dealing with IT's growing complexity, as systems and services multiply and are spurred on by the virtualization wildfire. Last week I posted a podcast, in which LogLogic CEO Pat Sueltz explained how log-management aids in visibility and creates a favorable return on investment (ROI) for enterprises.
LogLogic 4.6 will be available later this month as a free upgrade to current customers under Support contract. For new customers, pricing will start at $14,995 for the LX appliance, $53,995 for the ST appliance and $37,500 for the MX appliance.
But what does the IT executive have to gain similar insight into the systems operations that support the business operations? Well, they have reams of disparate logs and systems analytics data that pour forth every second from all their network and infrastructure devices. Making sense of the data and leveraging the analytics to reduce risk of failure therefore becomes the equivalent of BI for IT.
Now a major BI for IT provider, LogLogic, has beefed up its flagship products with the announcement of LogLogic 4.6. By putting more data together in ways that can be quickly acted on helps companies gain critical visibility into their increasingly complex IT operations, while gaining ease of regulatory compliance along with improved security. [Disclosure: LogLogic is a sponsor of BriefingsDirect podcasts.]
The latest version of the log management tools from San Jose, Calif.-based LogLogic includes new features that help give enterprises a 360-degree view of how business operations are running, including dynamic range selection, graphical trending, and real-time reporting. Among the improvements are:
The latest release provides improved search for IT intelligence, forensics workflow and advanced secure remote access control. LogLogic 4.6 will be rolled out for the company's family of LX, ST, and MX products, helping large- and mid-sized companies to capture, search and store their log data to improve business operations, monitor user activity, and meet industry standards for security and compliance.
- Index search user interface, including clustering by source, dynamic range selection, trending over time, and graphical representation of search results
- Search history, which automatically saves search criteria for later reuse
- Forensics clipboard to annotate, organize, record, and save up to 1,000 messages per clipboard, with up to 100 clipboards per user
- Active Directory remote authentication with role-based access control (RBAC)
- Enhanced security via complex password creation
- Enhanced backup/restore and failover, including incremental backup support and "backup now" capability
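Two of the search features above, clustering by source and trending over time, can be sketched in a few lines. This is a toy illustration of the general ideas, with invented records, not LogLogic's actual index:

```python
from collections import defaultdict

# Hypothetical (host, minute-bucket, message) log records for illustration
RECORDS = [
    ("fw01", "2008-09-22T10:01", "DENY tcp"),
    ("fw01", "2008-09-22T10:02", "DENY udp"),
    ("web02", "2008-09-22T10:01", "ERROR disk"),
]

def cluster_by_source(records):
    """Group messages by originating device -- 'clustering by source'."""
    clusters = defaultdict(list)
    for host, ts, msg in records:
        clusters[host].append((ts, msg))
    return clusters

def trend_over_time(records):
    """Count events per minute bucket -- 'trending over time'."""
    buckets = defaultdict(int)
    for _, ts, _ in records:
        buckets[ts] += 1
    return dict(buckets)

print(cluster_by_source(RECORDS)["fw01"])  # both fw01 events, in order
print(trend_over_time(RECORDS))  # {'2008-09-22T10:01': 2, '2008-09-22T10:02': 1}
```

Graphical trending, as in the release, is then just a matter of plotting those time buckets instead of printing them.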
I have talked extensively to the folks at LogLogic about the log-centered approach to dealing with IT's growing complexity, as systems and services multiply and are spurred on by the virtualization wildfire. Last week I posted a podcast in which LogLogic CEO Pat Sueltz explained how log management aids visibility and creates a favorable return on investment (ROI) for enterprises.
LogLogic 4.6 will be available later this month as a free upgrade to current customers under support contract. For new customers, pricing will start at $14,995 for the LX appliance, $53,995 for the ST appliance, and $37,500 for the MX appliance.
Genuitec expands Pulse provisioning system beyond tools to Eclipse distros, eyes larger software management role
Genuitec, a founding member of the Eclipse Foundation, has expanded the reach of its Pulse software provisioning system with the announcement of Pulse "Private Label," designed to give companies control over their internal and external software distributions.
Until now, Pulse was designed for managing and standardizing software development tools in the Eclipse environment. With Private Label, enterprises can manage full enterprise software delivery for any Eclipse-based product or application suite.
Plans call for subsequently expanding Private Label into a full lifecycle management system for software beyond Eclipse. [Disclosure: Genuitec is a sponsor of BriefingsDirect podcasts.]
Private Label, which can be tailored to customer specifications, can be hosted either by Genuitec or within a corporate firewall to integrate with existing infrastructure. Customers also control the number of software catalogs, as well as their content. Other features include full custom branding and messaging, reporting of software usage, and control over the ability for end-users to customize their software profiles, if desired.
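A managed software catalog of the kind described can be pictured with a small sketch. The record layout, field names, and products below are hypothetical, chosen only to illustrate catalog-controlled provisioning, and do not reflect Pulse's actual data model:

```python
# Hypothetical, illustrative catalog record -- not Pulse's actual schema
catalog = {
    "name": "ACME Developer Tools",
    "branding": {"logo": "acme.png", "message": "Approved builds only"},
    "entries": [
        {"product": "Eclipse", "version": "3.4.1"},
        {"product": "MyEclipse", "version": "7.0"},
    ],
    # The administrator decides whether end users may customize profiles
    "users_may_customize_profiles": False,
}

def allowed(catalog, product, version):
    """Check whether a requested install matches an entry in the managed catalog."""
    return any(e["product"] == product and e["version"] == version
               for e in catalog["entries"])

print(allowed(catalog, "Eclipse", "3.4.1"))  # True
print(allowed(catalog, "Eclipse", "3.3.0"))  # False: not in the approved catalog
```

The design point is that provisioning decisions flow from a centrally administered catalog, which is what lets an enterprise control content, branding, and end-user customization in one place.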
Last month, I sat down for a podcast with Todd Williams, vice president of technology at Genuitec, and we discussed the role of Pulse as a simple, intuitive way to install, update, and share custom configurations with Eclipse-based tools.
Coinciding with the release of Pulse Private Label is the release of Pulse 2.3 for Community Edition and Freelance users. Upgrades include performance improvements and catalog expansion. Pulse 2.3 Community Edition is a free service. Pulse 2.3 Freelance is a value-add service priced at $6 per month per user or $60/year. Pulse Private Label pricing is based on individual requirements.
More information is available at the Pulse site.