Addressing a Discover audience for the first time since HPE announced it would spin off many of its software lines to Micro Focus, Meg Whitman, HPE President and CEO, said the company is not abandoning those assets -- it becomes a major owner of Micro Focus in the deal -- and is continuing to build its software investments.
"HPE is not getting out of software but doubling-down on the software that powers the apps and data workloads of hybrid IT," she said Tuesday at London's ExCel exhibit center.
"Massive compute resources need to be brought to the edge, powering the Internet of Things (IoT). ... We are in a world now where everything computes, and that changes everything," said Whitman, who has now been at the helm of HPE and HP for five years.
HPE's new vision, said Whitman: to be the leading provider of hybrid IT that runs today's data centers, bridges the move to multi-cloud, and empowers the intelligent edge. "Our goal is to make hybrid IT simple and to harness the intelligent edge for real-time decisions" to allow enterprises of all kinds to win in the marketplace, she said.
Hyper-converged systems
To that end, the company this week announced an extension of HPE Synergy's fully programmable infrastructure to HPE's multi-cloud platform and hyper-converged systems, enabling IT operators to deliver software-defined infrastructure as quickly as customers' businesses demand. The new solutions include:
- HPE Synergy with HPE Helion CloudSystem 10 -- This brings full composability across compute, storage, and fabric to HPE's OpenStack technology-based hybrid cloud platform. Customers can run bare-metal, virtualized, containerized, and cloud-native applications on a single infrastructure, and dynamically compose and recompose resources for greater agility and efficiency.
- HPE Hyper Converged Operating Environment -- This software update leverages composable technologies to deliver new capabilities to the HPE Hyper Converged 380, including new workspace controls that allow IT managers to compose and recompose virtualized resources for different lines of business, making it easier and more efficient for IT to act as an internal service provider to their organization.
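To make the composability idea concrete, here is a minimal sketch of what "composing" infrastructure through an API can look like, modeled loosely on a OneView-style REST interface. The appliance address, endpoint path, headers, and payload fields are illustrative assumptions, not a documented contract.

```python
# A minimal sketch of composing infrastructure through a REST API.
# Endpoint path and payload fields are modeled loosely on HPE
# OneView-style composable APIs; treat them as illustrative only.
import requests

COMPOSER = "https://composer.example.com"  # hypothetical appliance address

def compose_server(session_token: str, template_uri: str, name: str) -> dict:
    """Request a new server profile (compute + storage + fabric) from a template."""
    payload = {
        "name": name,
        "serverProfileTemplateUri": template_uri,  # template defines the desired state
    }
    resp = requests.post(
        f"{COMPOSER}/rest/server-profiles",
        json=payload,
        headers={"Auth": session_token, "X-API-Version": "300"},
        verify=False,  # lab-only: skip TLS verification for a self-signed appliance cert
    )
    resp.raise_for_status()
    return resp.json()

# Recomposing is the same call against a different template, which is what
# lets IT shift the same physical resources between workload types.
```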
This year's HPE Discover was strong on showcasing the ecosystem approach to creating and maintaining hybrid IT. Heavy hitters from Microsoft Azure, Arista, and Docker joined Whitman on stage to show their allegiance to HPE's offerings -- along with their own -- as essential ingredients to Platform 3.0 efficiency.
See more of my HPE Discover analysis on The Cube.
HPE also announced plans to expand Cloud28+, an open community of commercial and public sector organizations with the common goal of removing barriers to cloud adoption. Supported by HPE's channel program, Cloud28+ unites service providers, solution providers, ISVs, system integrators, and government entities to share knowledge, resources and services aimed at helping customers build and consume the right mix of cloud solutions for their needs.
Internet of Things
Discover 2016 also saw new innovations designed to help organizations rapidly, securely, and cost-effectively deploy IoT devices in wide area, enterprise and industrial deployments. These solutions include:
- HPE Mobile Virtual Network Enabler
- HPE Universal IoT Platform
- Aruba ClearPass Universal Profiler
- Aruba 2540 Series Switches
As organizations integrate IoT into mainstream operations, the onboarding and management of IoT devices remains costly and inefficient, particularly at large scale. Concurrently, the diverse variations of IoT connectivity, protocols, and security prevent organizations from easily aggregating data across a heterogeneous fabric of connected things.
To improve the economies of scale for massive IoT deployments over wide area networks, HPE announced the new HPE Mobile Virtual Network Enabler (MVNE) and enhancements to the HPE Universal IoT (UIoT) Platform.
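To illustrate the heterogeneity problem such platforms address, here is a hedged sketch of data normalization during onboarding: per-vendor adapters map differing telemetry payloads onto one common schema before analysis. The broker address, topic layout, and field names are hypothetical, not taken from HPE's products.

```python
# A hedged sketch of normalizing heterogeneous IoT telemetry: devices
# speak different payload formats, and adapters map them onto one schema
# before analytics. Topics, fields, and the broker are hypothetical.
import json
import paho.mqtt.client as mqtt

# Per-device-type adapters: each maps a vendor payload to a common record.
ADAPTERS = {
    "vendor-a": lambda p: {"device": p["id"], "temp_c": p["temperature"]},
    "vendor-b": lambda p: {"device": p["dev"], "temp_c": (p["temp_f"] - 32) / 1.8},
}

def on_message(client, userdata, msg):
    vendor = msg.topic.split("/")[1]       # e.g. "iot/vendor-a/telemetry"
    raw = json.loads(msg.payload)
    record = ADAPTERS[vendor](raw)         # normalize to the common schema
    print("normalized:", record)           # hand off to analytics in practice

client = mqtt.Client()
client.on_message = on_message
client.connect("broker.example.com")       # hypothetical broker address
client.subscribe("iot/+/telemetry")
client.loop_forever()
```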
As the amount of data generated from smart “things” grows and the frequency at which it is collected increases, so will the need for systems that can acquire and analyze the data in real-time. Real-time analysis is enabled through edge computing and the close convergence of data capture and control systems in the same box.
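As a simple illustration of that convergence, the sketch below keeps a rolling window of sensor samples in memory and flags outliers on the spot -- the kind of decision an edge system can make without first shipping raw data to a data center. The window size and three-sigma threshold are arbitrary assumptions.

```python
# A minimal sketch of "analyze where you acquire": a rolling window over a
# sensor stream flags anomalies at the edge instead of shipping raw data
# upstream first. Window size and threshold are assumptions.
from collections import deque
from statistics import mean, stdev

WINDOW = deque(maxlen=100)  # last 100 samples kept in memory at the edge

def ingest(sample: float) -> bool:
    """Return True if the new sample looks anomalous versus recent history."""
    anomalous = False
    if len(WINDOW) >= 30:  # require some history before judging
        mu, sigma = mean(WINDOW), stdev(WINDOW)
        anomalous = sigma > 0 and abs(sample - mu) > 3 * sigma
    WINDOW.append(sample)
    return anomalous
```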
HPE Edgeline Converged Edge Systems converge real-time analog data acquisition with data center-level computing and manageability, all within the same rugged open-standards chassis. Benefits include higher performance, lower energy use, reduced space requirements, and faster deployment times.
"The intelligent edge is the new frontier of the hybrid computing world," said Whitman. "The edge of the network is becoming a very crowded place, but these devices need to be made more useful."
This means that the equivalent of a data center's big data crunching power needs to be brought to the edge affordably.
Biggest of big data
"IoT is the biggest of big data," said Tom Bradicich, HPE Vice President and General Manager, Servers and IoT Systems. "HPE EdgeLine and [partner company] PTC help bridge the digital and physical worlds for IoT and augmented reality (AR) for fully automated assembly lines."
IoT and data analysis at the edge help companies predict failures and head off maintenance needs in advance. And the ROI on edge computing will be easy to prove when factory downtime can be largely eliminated using IoT, data analysis, and AR at the edge.
Along these lines, Citrix, together with HPE, has developed a new architecture around HPE Edgeline EL4000 with XenApp, XenDesktop and XenServer to allow graphically rich, high-performance applications to be deployed right at the edge. They're now working together on next-generation IoT solutions that bring together the HPE Edge IT and Citrix Workspace IoT strategies.
In related news, SUSE has entered into an agreement with HPE to acquire technology and talent that will expand SUSE's OpenStack infrastructure-as-a-service (IaaS) solution and accelerate SUSE's entry into the growing Cloud Foundry platform-as-a-service (PaaS) market.
The acquired OpenStack assets will be integrated into SUSE OpenStack Cloud, and the acquired Cloud Foundry and PaaS assets will enable SUSE to bring to market a certified, enterprise-ready SUSE Cloud Foundry PaaS solution for all customers and partners in the SUSE ecosystem.
As part of the transaction, HPE has named SUSE as its preferred open source partner for Linux, OpenStack IaaS, and Cloud Foundry PaaS.
HPE also put muscle behind its drive to make high-performance computing (HPC) a growing part of enterprise data centers and private clouds. Hot on the heels of buying SGI, HPE has recognized that public clouds leave little room for workloads that do not perform best in virtual machines.
Indeed, if all companies buy their IT from public clouds, they have little performance advantage over one another. But many companies want to gain the best systems with the best performance for the workloads that give them advantage, and which run the most complex -- and perhaps value-creating -- applications. I predict that HPC will be a big driver for HPE, both in private cloud implementations and in supporting technical differentiation for HPE customers and partners.
Memory-driven computing
Computer architecture took a giant leap forward with the announcement that HPE has successfully demonstrated memory-driven computing, a concept that puts memory, not processing, at the center of the computing platform to realize performance and efficiency gains not possible today.
Developed as part of The Machine research program, HPE's proof-of-concept prototype represents a major milestone in the company's efforts to transform the fundamental architecture on which all computers have been built for the past 60 years.
Gartner predicts that by 2020, the number of connected devices will reach 20.8 billion and generate an unprecedented volume of data, which is growing at a faster rate than the ability to process, store, manage, and secure it with existing computing architectures.
"We have achieved a major milestone with The Machine research project -- one of the largest and most complex research projects in our company's history," said Antonio Neri, Executive Vice President and General Manager of the Enterprise Group at HPE. "With this prototype, we have demonstrated the potential of memory-driven computing and also opened the door to immediate innovation. Our customers and the industry as a whole can expect to benefit from these advancements as we continue our pursuit of game-changing technologies."
The proof-of-concept prototype, which was brought online in October, shows the fundamental building blocks of the new architecture working together, just as they had been designed by researchers at HPE and its research arm, Hewlett Packard Labs. HPE has demonstrated:
- Compute nodes accessing a shared pool of fabric-attached memory
- An optimized Linux-based operating system (OS) running on a customized system on a chip (SOC)
- Photonics/optical communication links, including the new X1 photonics module, online and operational
- New software programming tools designed to take advantage of abundant persistent memory.
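For a loose feel of what programming against a shared memory pool looks like, here is a single-node Python analogy: two compute processes attach to one memory pool and work on it in place, with no copies. Real fabric-attached memory spans whole racks of nodes over photonics, so this is strictly an illustration of the programming model, not of The Machine itself.

```python
# A single-node analogy for memory-driven computing: several compute
# processes operate on one shared pool of memory rather than each copying
# the data. Python's shared_memory only spans processes on one machine,
# so treat this purely as an illustration of the programming model.
from multiprocessing import Process, shared_memory
import numpy as np

N = 1_000_000  # pool holds one million float64 values

def worker(pool_name: str, start: int, stop: int):
    # Attach to the existing pool; no data is copied into the worker.
    shm = shared_memory.SharedMemory(name=pool_name)
    data = np.ndarray((N,), dtype=np.float64, buffer=shm.buf)
    data[start:stop] *= 2.0  # compute in place on the shared pool
    shm.close()

if __name__ == "__main__":
    shm = shared_memory.SharedMemory(create=True, size=N * 8)
    data = np.ndarray((N,), dtype=np.float64, buffer=shm.buf)
    data[:] = 1.0
    procs = [Process(target=worker, args=(shm.name, i * N // 2, (i + 1) * N // 2))
             for i in range(2)]
    for p in procs: p.start()
    for p in procs: p.join()
    print(data.sum())  # 2000000.0: both workers saw the same memory
    shm.close(); shm.unlink()
```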
In addition to bringing added capacity online, The Machine research project will increase its focus on exascale computing. Exascale is a developing area of HPC that aims to create computers several orders of magnitude more powerful than any system online today. HPE's memory-driven computing architecture scales from tiny IoT devices to the exascale, making it a strong foundation for a wide range of emerging high-performance and data-intensive workloads, including big data analytics.
Commercialization
HPE says it is committed to rapidly commercializing the technologies developed under The Machine research project into new and existing products. These technologies currently fall into four categories: non-volatile memory, fabric (including photonics), ecosystem enablement, and security.
Martin Banks, writing in Diginomica, questions whether these new technologies and new architectures represent a new beginning or a last hurrah for HPE. He poses the question to David Chalmers, HPE's Chief Technologist in EMEA, and Chalmers explains HPE's roadmap.
The conclusion? Banks feels that the in-memory architecture has the potential to be the next big step that IT takes. If all the pieces fall into place, Banks says, "There could soon be available a wide range of machines at price points that make fast, high-throughput systems the next obvious choice. . . . this could be the foundation for a whole range of new software innovations."
Storage initiative
Lastly, HPE announced a new initiative to address demand for flexible storage consumption models, accelerate all-flash data center adoption, assure the right level of resiliency, and help customers transform to a hybrid IT infrastructure.
Over the past several years, the industry has seen flash storage rapidly evolve from niche application performance accelerator to the default media for critical workloads. During this time, HPE's 3PAR StoreServ Storage platform has emerged as a leader in all-flash array market share growth, performance, and economics. The new HPE 3PAR Flash Now initiative gives customers a way to acquire this all-flash technology on-premises starting at $0.03 per usable gigabyte per month, a fraction of the cost of public cloud solutions.
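As a quick back-of-envelope check on what that rate implies (assuming purely linear pricing, which real contracts may not follow):

```python
# Back-of-envelope cost at the quoted rate, assuming a purely linear
# price; real pricing likely has tiers and terms not shown here.
usable_gb = 250_000             # e.g., 250 TB of usable capacity
rate_per_gb_month = 0.03        # quoted $0.03 per usable GB per month
monthly = usable_gb * rate_per_gb_month
print(f"${monthly:,.0f}/month")  # $7,500/month for 250 TB usable
```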
"Capitalizing on digital disruption requires that customers be able to flexibly consume new technologies," said Bill Philbin, vice president and general manager, Storage, Hewlett Packard Enterprise. "Helping customers benefit from both technology and consumption flexibility is at the heart of HPE's innovation agenda."
Given all of the news at HPE Discover, Whitman's HPE has charted a business path that places the company and its ecosystem of partners and alliances squarely at the center of the major IT trends of the next five years.
Indeed, I have attended HPE Discover conferences for more than 10 years now, and this keynote address and the accompanying news make more sense for the current and future IT market than any I have seen.
You may also be interested in:
- How Propelling Instant Results to the Excel Edge Democratizes Advanced Analytics
- How ServiceMaster Develops Applications with a Security-Minded Focus as a DevOps Benefit
- How JetBlue Mobile Applications Quality Assurance Leads to Greater Workforce Productivity
- How Software-defined Storage Translates into Just-In-Time Data Center Scaling
- How Cutting-Edge Storage Provides a Competitive Footing for Canadian Music Provider SOCAN
- Strategic DevOps -- How Advanced Testing Brings Broad Benefits to Operations and Systems Monitoring for Independent Health
- How Always-Available Data Forms the Digital Lifeblood for a University Medical Center
- Loyalty Management Innovator Aimia's Transformation Journey to Modernized and Standardized IT
- How HudsonAlpha Innovates on IT for Research-Driven Education, Genomic Medicine, and Entrepreneurship