Tuesday, June 3, 2008

Spike in enterprise 'events' spurs debut of Event Processing Technical Society

The recent growth -- and expected spike -- in business event data in enterprises has led a group of IT industry leaders to form the Event Processing Technical Society (EPTS), designed to encourage adoption and effective use of event processing methods and technology in applications.

Among the founding members are such heavy hitters as IBM, Oracle, TIBCO Software, Inc., Gartner Research, Coral8 Inc., Progress Software, and StreamBase.

Event processing pioneer Dr. David Luckham, a founding member of EPTS, explained in a press release:

“We've had decades of development of event processing technology for simulation systems, networking, and operations management. Now, the explosion in the amount of business event data being generated in modern enterprises demands a new event processing technology foundation for business intelligence and enterprise management applications.”

EPTS has five initial goals:
  • Document usage scenarios where event processing brings business benefit

  • Develop a common event-processing glossary for its members and the community-at-large to use when dealing with event processing

  • Accelerate the development and dissemination of best practices for event processing

  • Encourage academic research to help establish event processing as a research discipline and encourage the funding of applied research

  • Work with existing standards development organizations such as the Object Management Group (OMG), OASIS and the W3C to assist in developing standards in the areas of event formats, event processing interoperability, and event processing (meta) modeling and (meta) languages.

EPTS, which does not plan to develop standards itself, has already begun work on an initial draft of the proposed glossary. A use-case work group is generating templates for documenting and presenting the use cases.

Event processing was a hot topic at the recent TIBCO user conference, TUCON. (Disclosure: TIBCO is a sponsor of BriefingsDirect podcasts.)

Fellow ZDNet blogger Joe McKendrick has some thoughts on event processing, too.

The new consortium plans three additional work groups. The first will focus on developing information on event processing architecture. Another will identify requirements for the interoperability among event processing applications and platforms. The third will collaborate with the academic community to develop courses in this area.

The advance in the scale and complexity of streams of events will place a greater burden on infrastructure and architects. But the ability to manage and harvest analysis from these events could be extremely powerful, and provide a lasting differentiator for expert practitioners.
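
To make that concrete, here's a minimal, hypothetical sketch in Python of the kind of pattern detection an event processing engine performs -- correlating low-level events in a sliding time window to surface a higher-level business event. The event names, window and threshold are invented for illustration; no vendor's actual product or API is implied.

```python
from collections import deque
from dataclasses import dataclass

@dataclass
class Event:
    source: str      # e.g., "order-service", "payment-gateway"
    kind: str        # e.g., "order_placed", "payment_failed"
    timestamp: float # seconds since epoch

def detect_payment_trouble(stream, window_secs=60, threshold=5):
    """Raise a composite business event when too many payment
    failures occur within a sliding time window."""
    recent = deque()
    for event in stream:
        if event.kind != "payment_failed":
            continue
        recent.append(event.timestamp)
        # Drop failures that fell out of the window.
        while recent and event.timestamp - recent[0] > window_secs:
            recent.popleft()
        if len(recent) >= threshold:
            yield {"alert": "payment_failure_spike",
                   "count": len(recent),
                   "at": event.timestamp}
```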

While the processing of such events has its roots in financial companies and transactions, the engine for dealing with such throughputs and variable paths will find uses in many places. The vaulting commerce expected as always-on mobile Web, GPS location and social graph data collide is a prime example.

We hit on these types of transactions as the progeny of online advertising in a recent BriefingsDirect Analyst Insights roundtable podcast.

Consumers and end users should begin to enjoy what they may well perceive as "intelligent" services -- based on the fruits of complex event processing -- from their devices and providers. Harvesting and using more data from sensors and device meshes will also demand the scale that event processing provides.

We should also chalk this up to yet another facet of the growing definition of cloud computing, as event processing as a service within a larger set of cloud-based services will also build out in the coming years. The whole trend of event processing bears close monitoring.

EPTS will hold its next meeting Sept. 17-19 in Stamford, Conn. More information on the consortium can be found at the EPTS Web site.

Friday, May 23, 2008

Microsoft opens philosophical can of worms with Live Search Cashback

Talk about social graph personal information is bubbling up daily across the blogosphere, the Gillmor Gang and Techmeme. This may be among the most important discussions and topics of our time. How the "social mesh" works out now will affect our lives and businesses for a long time. It may even impact how we define what "me" is online. We really need to get it right, ASAP.

Yet much of the talk focuses on technology, privacy, use rights and still loosely defined standard approaches to protecting user control over data. It's still murky how the online social network services will own and control the user- and relationship-defining data inside of their social networks, including Twitter. But there's a larger set of issues that has to do with how we want technology and the Internet to affect us as people, as businesses, as a society, as a market of markets and as a species.

UPDATE: Many of these issues came up, especially toward the end of Friday's Gillmor Gang with Google Director of Engineering David Glazer. One takeaway is that, ironically, Microsoft should be among Google Friend Connect's best friends.

The discussion on social graph data portability gets to a philosophical level quickly, because the ways we have codified our personal relationships to each other -- and to larger organizations or power centers -- over eons do not necessarily apply adequately to the new virtual boundaries. It's hard to know on the Web what defines the rights of the individual, the family, tribe, community, company, village, town, state, nation, civilization, race, or species. Do accepted and proven cultural patterns offline fully translate into social patterns online?

The older established "contracts" -- from the Code of Hammurabi to the Magna Carta to the Mayflower Compact to the U.S. Constitution to the User Terms of Agreement -- do not seem to get the job fully done anymore. It's not clear what I am entitled to online, whereas I'm pretty sure I know what I'm entitled to offline, and I know what to do to enforce getting what I'm entitled to offline legally, ethically and politically.

In essence, we as online users and small businesses don't have any social-order contracts with the online providers, other than what their lawyers put in the small print when you "accept" their free or paid services. And, of course, they have made available their privacy policies for all to see. So there. Click away, users galore, while they store away the user data and relationship analytics.

As a person, you only retain the right not to click (as long as you pay throughout the two-year user subscription agreement, or suffer the penalty charge for leaving). If you're lucky you'll be able to take your phone number with you if you walk, but not necessarily your email address, your contacts, or your social interaction definitions. Most of the data about whatever you did while nestled in the rosy social bosom of their servers remains with them unless they volunteer to let it be open. So far.

Without belaboring the implications on the metaphysical scale, my point is to show how our online social interactions, as currently defined and controlled, place us in uncharted territory. And as with any social contracts, the implicit and explicit ramifications of where we find ourselves later on need to be taken very seriously.

We'll want the ability to back out, if the unforeseen future warrants it, without too much pain, with our open data intact. We should all want escape clauses for what we do online the next several years, just to be safe. Who you gonna call if it's not fair?

If things don't go well for the user or individual business, what could be done? Because this is about the Web, there isn't a government to lobby, a religious doctrine to fall back on, a meta data justice code of conduct, nor an established global authority to take directives from. The older forms of social contract enforcement don't have a clue. There is only the User Terms of Agreement, the codex of our time. Read it and weep.

Because this is about the Web, the early adopters basically make it up as they go and hope for the best. It's been a great ride. The service providers try and keep up with the fast-changing use patterns, and then figure out a business model that has legs. They write up more User Terms of Agreement. Startups get funded based on their ability to get some skin in the game, even without a business model. They show the investors the User Terms of Agreement, and get their rounds. More work goes into the User Agreements than into the infrastructure to keep the thing working once the clicks come.

This laissez-faire attitude has worked pretty darn well for building out the Web as an industry, thankfully. But now we're talking about more than building out the no-holds-barred Web, we're talking about social contracts ... We're talking about what the user possesses from their role in building out the Web, in populating the social networks, in authoring the blogosphere. Is there any social collective ownership or rights by the participants in the Web? Or is it only really -- in the final analysis -- owned by those who control the means of production of the services?

There's the Web, and there's the blogosphere -- are they the same? What rights does the individual, the person, the blog entity have on the commercial Web? Does the offline me possess the same social powers online? I really don't know.

What's clear is that people like Mike Arrington, Marc Canter, Steve Gillmor, Robert Scoble and Dave Winer (among many others) want as much freedom about what they do online as Western Civilization has endowed on them and their ancestors offline. In some circles, some of these people want even more social power online than what has been the norm offline. More power to them.

There is a power clash a brewin'. The U.S. has long struggled over states' rights versus federal rights. The individual has looked to both -- and pitted them against each other -- to define and protect individual rights.

But what about online? When push comes to shove, how do individual rights assert themselves against what the service providers can perfectly legally assert? If the server farm says they own your online address book, they probably do legally (see the User Terms). If they say they own the meta data from your click stream on their servers over the past three years, they probably do.

So far, user rights have been strictly voluntary on the part of the providers. Some are built into agreements. The rising tide of online adoption and the providers' essential need to generate traffic and clicks have protected users, to a point. Let's hope it continues. I hope voluntary is enough.

Folks, you should recognize that you already have a lot of power, given the fact that social networks are falling all over themselves to show how "open" they are. They fear that you can and will bolt, even if you lose some data (the first time). Data portability is recognized by the Googles and Microsofts as hugely important; shouldn't it be huge to all of us, too?

Because as we move to always-on social interactions across all we do on the Web, what we do socially online may begin to outweigh what we do socially offline. For some of us this is already true. What distinguishes us as online or offline is blurred, and I believe it will grow more so until any difference becomes irrelevant.

I am social, therefore I am social. It will not matter how or where. Yet online, the fabric of control over my social universe is more under the influence of the User Terms of Agreement than anything else. Will I lose any part at all of the personal freedoms won by my ancestors when I move my social activities online?

What defines any person by what they do online -- is this a business agreement based on User Terms of Agreement, or something more defined by centuries-old social contracts and mores? Does freedom trump user agreements?

When would a concept like human freedom trump any user agreement, even if it is well documented in Delaware courts? Am I free to take my social graph data, that which defines me as me, with me anywhere online because it's an inalienable right? If so, I should not need any OpenSocial standards. It's self-frickin-evident! I should not need it in the User Terms of Agreement because it's long established as precedent.

But here's the rub that came to the surface this week when Microsoft crossed the Rubicon in the Web world with Live Search Cashback.

If users can and will assert that their social graph information is theirs by virtue of their culturally endowed freedom as a human, then what about their "commerce graph?" Who you are by what you buy is not too much different from who you are by whom you associate with. Is commerce social, or is being social commerce?

My social graph contains my personal meta data and my index of contacts, their context to me, and what actually defines me as a social creature. My commerce graph exists too; it's on Amazon, Walmart.com, and dozens of other vendors that know me by how I shop, learn, peruse, compare and perhaps buy. If I search as part of the shopping process then my commerce graph is on Google, Yahoo! and Microsoft (mostly on Google). I do commerce through my social activities, and I may want a social network with those I buy from and sell to.

All this user intention and activity information is related and should not be separated. I should be able to mix and match my data regardless of the server. I reached those servers through my own device and browser; I made those clicks and punched those keys on my machine before they showed up on someone else's. I own my actions as a free human.

Microsoft is now finding ways to build out a business model via Live Search Cashback (with more to come no doubt) that takes your commerce graph and in essence, sells or barters it to the sellers of goods and services. I'm not saying this is in any way bad, or unproductive. It seems a logical outcome of all that has preceded it online. I expect others to follow suit.

But it does have me wondering. Who owns my commerce graph? Isn't it connected to my social graph? And if Microsoft can make money off of it, why can't I? Can I only make money off of my commerce graph when I use a certain provider's services and only through its partners? If so, then it's not really my commerce graph. I'm only as free as the User Terms of Agreement say.

If my social graph is mine, and I can move and use it freely, then I surely will want the same to be true for my commerce graph (or any other user pattern graph). This is an essential inalienable right, but I think I want it in writing.

So, please, in order for any of us progeny of Western Civilization to use any of these burgeoning online services, can we have all of this freedom business spelled out clearly in the User Terms of Agreement?

Let's make it the first line item for all online agreements from now on: "Dear User, You are a human and you are free and so that also pertains to everything you do on our Web sites and services."

Until we have technical standards or neutral agencies that route and broker control over our own use data, we should all insist on better User Terms of Agreement, ones that spell out the obvious. We are free, our data is ours, and we should be able to control it.

Wednesday, May 21, 2008

ZoomInfo spins off 'bizographic' platform for controlled circulation online advertising play

Business information provider ZoomInfo has spun off its advertising business units into a new company, Bizo, offering a targeted B2B advertising platform, or what it calls "bizographic" advertising.

Privately held and venture-backed ZoomInfo, Waltham, Mass., announced a new set of business segments last fall, but has now taken the additional step of spinning the unit out. Former general manager and senior vice president Russell Glass will serve as CEO of the new company, which is expected to launch later this year. [Disclosure: ZoomInfo has been a sponsor of some BriefingsDirect B2B podcasts and videocasts that I have produced.]

Bizographic advertising, as ZoomInfo explains it, provides highly targeted demographic and behavioral advertising, allowing marketers to target their online advertising based on the audience of a site instead of the content.

For example, if a company wants to reach technology decision makers for an IT product offering or high-income individuals for a platinum credit card offer, it could use bizographic advertising to target directors of IT or CEOs respectively.
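
For the sake of illustration, here's a tiny, hypothetical Python sketch of what audience-based matching can look like under the hood: campaigns carry segment criteria, and eligibility is decided by the visitor's business profile rather than the content of the page being viewed. The profile fields, campaign names and criteria are all invented.

```python
# Toy audience-based ("bizographic") ad matching: pick campaigns whose
# segment criteria fit the visitor's business profile, regardless of
# the page content being viewed.
visitor = {
    "title": "Director of IT",
    "industry": "Manufacturing",
    "company_size": 5000,
}

campaigns = [
    {"name": "Enterprise backup suite",
     "titles": {"Director of IT", "CIO"},
     "min_company_size": 1000},
    {"name": "Platinum business card",
     "titles": {"CEO", "CFO"},
     "min_company_size": 0},
]

def eligible(campaign, profile):
    return (profile["title"] in campaign["titles"]
            and profile["company_size"] >= campaign["min_company_size"])

matches = [c["name"] for c in campaigns if eligible(c, visitor)]
print(matches)  # -> ['Enterprise backup suite']
```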

The field has heated up recently as CBS intends to acquire CNET (parent company of this blog's host, ZDNet) and its BNET division, which also slices and dices audiences by work and functional definitions for the benefit of advertising targeting. Could Bizo also be on the block?

According to ZoomInfo officials, Bizo will continue to leverage the company's understanding of business people and companies to allow marketers to target business users based on thousands of segmenting possibilities, including combinations of title, company, industry, functional area, company size, education, location, etc. The company expects to have more than 20 million targetable business users in its network when it launches.

Bryan Burdick, ZoomInfo's president, explained the move:

"While B2B advertising is complimentary to ZoomInfo’s business, the market has been starved for the ability to target business professionals online. Creating a new business in order to meet that need was an ideal solution for us."

I gave my readers a heads-up on what I called "controlled circulation advertising" last December, referring specifically to ZoomInfo:

ZoomInfo is but scratching the surface of what can be an auspicious third (but robust) leg on the B2B web knowledge access stool. By satisfying both seekers and providers of B2B information on business needs, ZoomInfo can generate web page real estate that is sold at the high premiums we used to see in the magazine controlled circulation days. Occupational-based searches for goods, information, insights and ongoing buying activities is creating the new B2B controlled circulation model.

ZoomInfo, a business information search engine, finds information about industries, companies, people, products and services. The company’s semantic search engine continually crawls millions of company Websites, news feeds and other online sources to identify company and people information, which is then organized into profiles.
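
As a rough, hypothetical illustration of what "organizing crawled text into profiles" can mean, the Python sketch below pulls name, title and company mentions out of unstructured text and merges repeat mentions into a single profile. The regular expression and sample sentence are invented; ZoomInfo's actual semantic engine is certainly far more sophisticated.

```python
import re

# Match patterns like "Jane Doe, vice president of marketing at Acme Corp".
PATTERN = re.compile(
    r"(?P<name>[A-Z][a-z]+ [A-Z][a-z]+),\s*"
    r"(?P<title>[^,]+?)\s+(?:of|at)\s+(?P<company>[A-Z][\w&. ]+)"
)

def extract_profiles(text):
    profiles = {}
    for m in PATTERN.finditer(text):
        name = m.group("name")
        # Merge repeat mentions of the same person into one profile.
        profile = profiles.setdefault(name, {"name": name, "mentions": 0})
        profile["title"] = m.group("title").strip()
        profile["company"] = m.group("company").strip()
        profile["mentions"] += 1
    return profiles

sample = "Jane Doe, vice president of marketing at Acme Corp, said ..."
print(extract_profiles(sample))
```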

ZoomInfo currently has profiles on nearly 40 million people and over 4 million companies, and its search engine adds more than 20,000 new profiles every day.

Splunk goes virtual, unveils broad IT search capabilities for Citrix XenServer

Splunk, which provides indexing and search technology for IT infrastructures, this week made its move into the virtual realm with the announcement of Splunk for Citrix XenServer Management.

The San Francisco company says this is just its first foray into search support services for virtualization and that it will release similar applications for each of the leading server virtualization platforms in the near future. [Disclosure: Splunk is a sponsor of BriefingsDirect podcasts.]

The Splunk announcement comes during a Citrix cavalcade of news and developments, including the expected delivery of its desktop as a service portfolio.

While server virtualization provides significant efficiency and utilization improvement benefits to datacenters, it also brings complexity in troubleshooting glitches. Performance and capacity issues can arise when applications share the same physical host. With multiple virtual machines (VMs) sharing a pool of server, storage and network resources, changes to any one layer or VM could potentially affect others – and the applications they contain. Root cause analysis is even more of a challenge when instances of virtualized containers and runtimes pop in and out of use via dynamic provisioning.

Splunk's indexing and search approach aims to provide a full view of IT-generated use data, not only from the hypervisor and VM, but from the server, guest operating system, applications, and the network. Splunk's technology indexes data across all tiers of the infrastructure in near real-time. This allows operators and administrators to maintain a large, dynamic IT environment with fewer people, with higher automation and easier service performance management.
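
To picture what cross-tier correlation buys an administrator, here's a rough, hypothetical Python sketch -- not Splunk's implementation or API -- that indexes timestamped log lines from the hypervisor, guest OS and application into one time-ordered view, the raw material for root cause analysis. The file paths and timestamp format are invented.

```python
import re
from pathlib import Path

# Hypothetical log files from different tiers of a virtualized stack.
SOURCES = {
    "hypervisor": Path("/var/log/xen/xend.log"),
    "guest_os": Path("/var/log/guest/syslog"),
    "app": Path("/var/log/app/server.log"),
}

TS = re.compile(r"^(\d{4}-\d{2}-\d{2}[ T]\d{2}:\d{2}:\d{2})")

def merged_timeline(sources=SOURCES):
    """Index every line by its timestamp and source tier, then return
    one time-ordered view across all tiers."""
    entries = []
    for tier, path in sources.items():
        if not path.exists():
            continue
        for line in path.read_text(errors="ignore").splitlines():
            m = TS.match(line)
            if m:
                ts = m.group(1).replace("T", " ")
                entries.append((ts, tier, line))
    return sorted(entries)

# A crude "search": show everything that looks like an error, in order.
for ts, tier, line in merged_timeline():
    if "error" in line.lower():
        print(ts, tier, line)
```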

Splunk for Server Virtualization Management supports virtualization planning, workload optimization, performance monitoring, root cause analysis and log management, says the company.

The new product is available immediately. Users can download a free 30-day trial from the company's Web site.

Splunk has been in the news lately, and on Monday announced that communications provider BT has agreed to license Splunk's IT search platform technology to build a managed-security product that will allow customers to preserve 100 percent of the logs on a network.

Three weeks ago, the company unveiled Splunk for Change Management, an application to audit and detect configuration changes, and Splunk for Windows, which indexes all data generated by Windows servers and applications.

Tuesday, May 20, 2008

IBM executive defines next generation of enterprise datacenters through cloud computing

IBM Vice President for Enterprise Systems Rich Lechner took the stage at the Forrester Research IT Forum on Tuesday to explore the definition of new enterprise datacenters that will enable new levels of business innovation.

Factors buffeting the definition of the new class of datacenter include globalization, a rising tide of information, and the need for expanded flexibility and adaptability in business models.

To compete, companies need to operate without borders and become a globally integrated enterprise. "There are huge resource pools emerging around the world ... with new ideas and creativity," said Lechner. It's more than outsourcing, he said, it's about integrating these resources.

The tide of data and devices, of resources, and assets will continue to explode. How can you best use the data that flows all around you?

New business models will evolve, said Lechner. The impact of social networking and peer influences on buying decisions is just beginning to be felt.

Virtualization will remake the IT landscape, as will cloud computing, virtual worlds, and new, higher levels of scaling when it comes to compute power, said Lechner.

Cloud computing enables nearly unbounded aspirations for the best user experiences. "It provides anytime, anywhere access to IT resources delivered dynamically as a service," he said. Cloud computing expands capacity almost indefinitely.

IBM's cloud initiatives are allowing technology incubation, data-intense workloads, government-led initiatives and new types of software development support.

IT plus cloud computing can enable change. How to get started? Simplify using virtualization, share infrastructures via SOA, and create a dynamic ability to access data and knowledge, said Lechner.

The world is changing to enterprises without borders, unbounded IT infrastructure, much larger data sets, and a need for collaboration that increasingly crosses many organizational and sourcing types, he said.

Additionally, IBM is learning a lot from Google and vice versa when it comes to cloud computing, said Lechner. Cloud computing allows its practitioners to isolate compute units and make their use far more efficient economically via dynamic provisioning.

For data security, users can physically isolate data using partitioning. IBM for years has been hosting multiple companies on single mainframes with no data protection or privacy issues. The technology exists to leverage the economics of cloud computing while protecting data, said Lechner.

"It's about removing IT has an inhibitor," he said.

Business imperatives theme dominates Forrester's IT Forum conference opener

Forrester Research, the Cambridge, Mass. market research and analysis firm, kicked off its influential IT Forum conference today in Las Vegas with a keynote address by founder and Chairman George Colony on CEO success imperatives.

What success imperatives do you have? Colony asked a series of CEOs. Here are the seven answers he got:

  • Getting, keeping and building the best people.
  • Engendering collaboration.
  • Reaching global markets.
  • Increasing profit.
  • Building a positive culture.
  • Customers, customers, customers.
  • Driving innovation.

What was missing from the list? Colony asked the crowd. Technology ... it didn't make the top imperatives, based on CEO priorities. So to move from IT to business technology, there needs to be more connection between what IT executives focus on and what the CEO focuses on.

Colony ended his introduction to the event with the pithy conclusion that technology is buried in business imperatives, rather than being an imperative itself.

Then Forrester executives and research directors Mike Gilpin and Eric Brown took the stage to welcome the 1,400 conference goers to the 14th annual IT Forum at the Sands Expo Center. They showed McKinsey research indicating that innovation is essential to companies and their growth.

They define innovation as both top-down and bottom-up inside companies. They cite the iPhone as an example of this innovation. And they cite Amazon's one-click buying process as another. Also, the One Laptop Per Child initiative and mobile networks in developing markets signal innovation.

Business innovation needs to pull this all together, say Gilpin and Brown.

Forrester VP and Principal Analyst Bobby Cameron implored the conference crowd not to wait to innovate. Businesses need innovation but IT is disconnected from innovation, Cameron said. Part of the goal of the conference is to rectify this.

"Companies say that technology is transformational, but they invest in technology to improve efficiency or reduce costs," said Cameron, based on his research. "They don't do what they say."

Innovation gets stalled. "Sludge in IT's engine stymies innovation," said Cameron. IT needs to stop hesitating, not just focus on costs, and move beyond "heavy processes" and grow more fleet.

Business innovation needs to transform processes, and boost the value and impact of the business on customers and partners, he said.

There is confusion on what leads to innovation, said Cameron. "People aren't asking the right questions," he said, adding that investments are decoupled ineffectively from game-changing ideas.

"The innovation continuum" needs to extend across all aspects of business investments and thought leadership and new ideas. There needs to be a better way to join the two, and to get the money to act on good ideas, said Cameron.

Collaboration networks can help bring inventors and transformers into the innovation continuum, even if they exist outside of the company. IT shops need to play the role of brokers for innovation, he said, and to better play the roles of inventor, transformer, financier (to a lesser extent), and broker.

IT should build out innovation networks to become transformation agents. And these IT departments need to make it clear that they play this role.

It's up to the business to become adept at funding innovation, on an ongoing and sustained basis. Part of securing funding requires innovation context, innovation networks, and a process for ownership of funding, said Cameron. The result should be an "Innovation Pipeline" that has its own funding, is governed by an innovation team, and takes in ideas generated from anywhere, said Cameron.

This pipeline runs in parallel to regular business activities. This allows for sustained innovation, year after year. More businesses need to become innovation leaders, said Cameron.

How to start? Build an innovation culture, by bringing in the right people. Make innovation part of the process, using portfolio management and analyzing the portfolio to identify where innovation already exists. Technologies also need to be in place to capture innovative ideas.

MokaFive announces general availability of LivePC desktop-as-a-service offering

MokaFive, a desktop virtualization company, has announced the general release of v.10 of its Virtual Desktop Solution, a cross-platform desktop-as-a-service (DaaS) product.

The Redwood City, Calif. company says its DaaS solution is already deployed in nearly 50 pilot programs and has been downloaded over 80,000 times. The virtual desktops, known as LivePCs, run on Windows, Macintosh, and MokaFive's BareMetal Linux operating systems.

I've blogged about MokaFive's DaaS product before and have explained how it operates:

By creating a "Live PC" desktop, which contains the operating system and application stack, and having it hosted by MokaFive, administrators can distribute, manage, and update the desktop from a single copy on the host computer. Users sync their local desktop with the copy in the cloud, allowing them to always be able to access the latest pristine version.

When synced with the Live PC, the desktop is loaded onto the local device, whether a PC or even a flash drive. It then runs as a virtual machine on that device and users can work online or offline. Changes made by administrators are reflected in the local device whenever users connect to the Live PC. By using a flash drive, users can access their desktop on any x86-based machine, having all their productivity tools at their fingertips, but leaving no footprint behind once the flash device is unplugged.
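
A minimal, hypothetical Python sketch of that sync pattern might look like the following: the client compares its local image version against the master copy published by administrators and pulls a fresh image only when it is stale. The manifest URL, fields and file locations are invented, not MokaFive's actual protocol.

```python
import json
import shutil
import urllib.request
from pathlib import Path

# Hypothetical endpoints and paths, invented to illustrate the
# "sync local desktop with the master copy" pattern described above.
MANIFEST_URL = "https://example.com/livepc/manifest.json"
LOCAL_DIR = Path.home() / ".livepc"

def sync_livepc():
    """Pull the latest published desktop image if ours is stale,
    then hand the image off to the local virtual machine runner."""
    LOCAL_DIR.mkdir(exist_ok=True)
    manifest = json.load(urllib.request.urlopen(MANIFEST_URL))
    version_file = LOCAL_DIR / "version"
    local_version = version_file.read_text() if version_file.exists() else ""

    if local_version != manifest["version"]:
        # Administrators updated the master copy; fetch the new image.
        with urllib.request.urlopen(manifest["image_url"]) as resp, \
             open(LOCAL_DIR / "desktop.img", "wb") as out:
            shutil.copyfileobj(resp, out)
        version_file.write_text(manifest["version"])

    return LOCAL_DIR / "desktop.img"   # boot this as a local VM
```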

MokaFive Virtual Desktop Solution is available in two versions. MokaFive Professional is for enterprise and workgroup deployments and will be sold via annual subscription. MokaFive Express, designed for home users and developers, is available as a free download. A library of LivePCs, created by MokaFive and the user community, is available at the MokaFive lab.

Monday, May 19, 2008

Panda Security delivers cloud-based security management service for SMBs

IT security provider Panda Security has unveiled its Managed Office Protection solution, a security-as-a-service offering aimed at small and medium businesses (SMBs) as well as large companies with a significant number of geographically dispersed offices.

The service from Panda keeps the total cost of ownership (TCO) to a minimum by hosting all information in the cloud and providing a Web-based console through which administrators can configure security resources.

The lower cost also comes from the small footprint of the Panda agent on each PC; at about 5 MB, it's much smaller than other anti-malware agents. More details at Panda's blog.

Administrators can also assign profiles across the organization to adapt security measures to individual and department requirements. The service-based protection is also geared toward SOHO workers, who may just use outsourced IT support and repair shops or consultants.

The managed protection product provides "collective intelligence" that automatically detects, correlates, and responds to malware across a network of PCs. The remote management tools allow IT managers -- or support shops -- to use any computer on the Internet to change user specifications, track IP addresses, and enable and disable security features.

Using a centralized Web console, administrators can configure updated information to protect against zero-day attacks. Updates are completed via peer-to-peer networks from the nearest desktop, minimizing bandwidth consumption.
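
Here's a rough, hypothetical Python sketch of that nearest-peer update pattern -- not Panda's actual mechanism: the agent first tries desktops on the local network for the signature delta and only falls back to the cloud service when no peer responds. The peer addresses, port and URLs are invented.

```python
import socket
import urllib.request

# Invented peer addresses and URLs, purely to illustrate "fetch the
# delta from the nearest desktop, fall back to the cloud" behavior.
# A real agent would authenticate and encrypt peer traffic.
LAN_PEERS = ["192.168.1.10", "192.168.1.11"]
PEER_PORT = 8080
CLOUD_URL = "https://updates.example.com/signatures/latest.delta"

def fetch_update_delta():
    # Prefer a peer on the local network that already holds the delta,
    # so only one machine per office pulls it over the WAN.
    for peer in LAN_PEERS:
        try:
            with socket.create_connection((peer, PEER_PORT), timeout=1):
                pass  # peer is reachable
            url = f"http://{peer}:{PEER_PORT}/signatures/latest.delta"
            return urllib.request.urlopen(url, timeout=5).read()
        except OSError:
            continue  # peer offline; try the next one
    # No peer available: pull the small delta from the cloud service.
    return urllib.request.urlopen(CLOUD_URL, timeout=10).read()
```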

Real-time information about detection activity can be accessed by administrators on the Web console. Administrators can send suspected threats to PandaLabs for analysis. Periodic security audits can help ensure compliance with regulations such as SOX, PCI and HIPAA, among others. Panda provides an ongoing list of current threats.

Because it's a cloud-based service, it can react in near real-time to Internet hazards as they arise, then push the updates as small deltas out to the admins or directly to supported PCs. Naturally, the service only supports Windows, but it goes back as far as Windows 95 and up to Vista. Panda is looking at Mac OS X and Linux support, but demand has not been there, given Windows' propensity as a malware target.

Managed Office Protection is available to value added resellers looking to offer security services to clients. Pricing is in the $40 per user per year range. In a related announcement, Panda said that Tech Data Corp., Clearwater, Fla., has signed an exclusive distribution agreement for the product.

I'd like to see the remote access and remote PC support crowd coordinate better with suppliers like Panda. Any and all PC support should just include services like this. Already many do, but the SOHO market still needs more convenient approaches at the price point Panda is providing.

Panda Managed Office Protection is available immediately and can be downloaded from the Panda Web site.