Monday, April 21, 2008

Kapow's Web-to-spreadsheet data service helps enterprises exploit cloud-based mashups

Kapow Technologies at the Web 2.0 Expo this week will aim to solve one of the biggest problems facing enterprises: taming external-internal data chaos by leveraging cloud-based data management services.

With Kapow OnDemand, a cloud-based service that uses the company's Mashup Server, Kapow will provide the ability to create data-rich mashups in minutes and then make that Web data ready for delivery into ubiquitous internal Microsoft Excel spreadsheets, or other enterprise applications and integration infrastructure.

Kapow OnDemand offers users access to a visual scripting environment for building the services and feeds that automate the access and delivery of web-based intelligence and data -- then delivers it to the desktop or application of choice. According to Kapow, even Web-savvy, non-technical users will be able to build "robots" in a matter of minutes that can extract, transform, and output Web data.

The hosted service may provide the fastest way to deliver real-time data from the Web into Excel spreadsheets, and therefore into the hands of business analysts, business processes and for internal publishing feeds and streams. This will circumvent the old cut-and-paste logjam and allow analysts to rapidly collect market data on such things as competitive pricing, product mix analysis, or financial metrics, for example.

Despite a huge and growing amount of "webby" online data and content, capturing and defining that data and then making it available to users and processes has proven difficult, due to differing formats and data structures. The usual recourse is manual intervention, and oftentimes cut-and-paste chores. IT departments are not too keen on such chores.
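To make the cut-and-paste problem concrete, here is a minimal Python sketch of what such a harvesting "robot" does. This is not Kapow code; the HTML snippet and field names are invented for illustration. It extracts a table from a Web page and emits CSV that Excel can open.

```python
# A minimal sketch of the kind of "robot" Kapow describes: extract a
# table from a Web page and output it as CSV for a spreadsheet.
# The HTML below stands in for a live page; all names are hypothetical.
import csv
import io
from html.parser import HTMLParser

class TableExtractor(HTMLParser):
    """Collects the text of each <td>/<th> cell, row by row."""
    def __init__(self):
        super().__init__()
        self.rows, self._row, self._in_cell = [], [], False

    def handle_starttag(self, tag, attrs):
        if tag == "tr":
            self._row = []
        elif tag in ("td", "th"):
            self._in_cell = True
            self._cell = []

    def handle_endtag(self, tag):
        if tag in ("td", "th"):
            self._in_cell = False
            self._row.append("".join(self._cell).strip())
        elif tag == "tr" and self._row:
            self.rows.append(self._row)

    def handle_data(self, data):
        if self._in_cell:
            self._cell.append(data)

html = """<table>
  <tr><th>Product</th><th>Competitor price</th></tr>
  <tr><td>Widget A</td><td>$19.99</td></tr>
  <tr><td>Widget B</td><td>$24.50</td></tr>
</table>"""

extractor = TableExtractor()
extractor.feed(html)

# Write the harvested rows as CSV -- the delivery step into the
# ubiquitous internal spreadsheet.
buf = io.StringIO()
csv.writer(buf).writerows(extractor.rows)
print(buf.getvalue())
```

The point of the sketch is the shape of the work, not the plumbing: find the structure buried in a human-readable page, normalize it, and hand it to the tool the analyst already lives in.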

But Kapow's OnDemand approach provides access to the underlying data sources and services to be mashed up and uses a Robot Designer to construct custom Web harvesting feeds and services in a flexible role-based execution runtime. Additionally, associated tools allow for monitoring and managing a portfolio of services and feeds, all as a service.

Deployed on a commercial-grade grid computing environment, OnDemand offers tight security, load balancing, high availability, failover, and automated backup and restore. Pricing for the service will begin at $3,400 per month.

Kapow this week will also announce its Connector for Excel, which allows spreadsheet users to find and execute Web services. By using Kapow OnDemand or the Kapow Mashup Server Web 2.0 Edition along with Connector for Excel, these users can bring XML content and Web services directly into their spreadsheets.

Kapow will offer a product preview Webinar on April 29, covering both OnDemand and the Excel Connector.

Last January, I sat down for a sponsored podcast with Kapow CTO Stefan Andreasen. He explained how much of the potentially useful data on the Internet exists in a form that is designed to be easily read by humans, and not by enterprise applications. [Disclosure: Kapow is a sponsor of BriefingsDirect podcasts.]
There is a third group, which I call intelligence data. That's hard to find, but gives you that extra insight, extra intelligence, to let you draw a conclusion which is different from -- and hopefully better than -- your competitors. That’s data that’s probably not accessible in any standard way, but will be accessible on the Web in a browser. This is exactly what our product does. It allows you to turn any Web-based data into standard format, so you can access what I call intelligence data in a standard fashion.
Joe Keller, Kapow's chief marketing officer, explained to Computerworld the significance of the new OnDemand service:
By connecting [Web mashups] to Excel, users can have real-time data inside their spreadsheets along with their corporate data to get that 360-degree view of the data they are analyzing. If users can build spreadsheets, if they can do the programming of those spreadsheets, the plug-in makes [mashups] a native element inside of Excel.

Mashups provide that layer we need to really let the business do a lot of the work themselves. [IT] still governs the services and creates the services, but it allows the business to start doing business themselves.
Last month, Kapow raised another $11.6 million from investors, including Steamboat Ventures, Kennet Partners, and NorthCap Partners.

This service and the means to sidestep IT (in a good way) so that line of business decision-makers can avail themselves of all the data they can, regardless of its origins, begins the path toward solving the data management mess most enterprises are in. I expect to see many variations on this theme, with data access growing richer and varied -- but also with access and security controls.

As enterprises grasp the productivity that comes with public cloud data management, it may well spur them to bring more of their own data into the services layer where it can be delivered to where it brings the most value.

Sunday, April 20, 2008

Open source SOA infrastructure project CXF elevated to full Apache status

After community incubation and development for nearly two years, the Apache CXF open-source SOA and middleware interoperability framework evolved last week into a full project of the Apache Software Foundation.

CXF, with some 60,000 downloads since July 2007, takes its place alongside 60 other Apache projects. The framework began its life as Celtix, which was supported by IONA Technologies in the ObjectWeb community, and then merged with XFire from Codehaus. It was later moved to the Apache incubator process.

CXF's graduation from incubator to project status involved widespread developer collaboration, taking it through six releases. CXF is now ranked among the top 10 Java software projects, receiving support from the Mule and JBoss communities.

It also serves as the foundation for IONA's FUSE Services Framework. Dan Kulp, IONA's principal engineer, has been designated as the CXF project management committee chair. [Disclosure: IONA is a sponsor of BriefingsDirect podcasts.]

Nearly a year ago, I sat down with Kulp for a podcast on Apache and CXF. Here's what he had to say:
CXF is really designed for high performance, kind of like a request-response style of interaction for one way, asynchronous messaging, and things like that. But it’s really designed for taking data in from a variety of transports and message formats, such as SOAP or just raw XML. If you bring in the Apache Yoko project, we have CORBA objects coming in off the wire. It basically processes them through the system as quickly as possible with very little memory and processing overhead. We can get it to the final destination of where that data is supposed to be, whether it’s off to another service or a user-developed code, whether it’s in JavaScript or JAX-WS/JAXB code.

That’s the goal of what the CXF runtime is -- just get that data into the form that the service needs, no matter where it came from and what format it came from in, and do that as quickly as possible.
You can listen to the podcast here and read a full transcript here. IONA recently told fellow ZDNet blogger Paula Rooney that it intends to continue to invest in and support open source activities. And IONA is increasing its role in Apache.

As we now explore the fascinating intersection of SOA and WOA -- with on-premises services and cloud-based resources (including data) supporting ecologies of extended-enterprise business processes -- I expect open source projects such as CXF to play a major role. Creating federated relationships between private and public clouds and their services and resources requires more than just industry standards. It requires visibility and access, the type that comes from open source communities and open-use licenses.

I expect that open source code-based services and infrastructure will be the preferred choice for building the layers of an extended enterprise service ecology that binds organizations while protecting their assets and interests -- and which allows for trust and cooperation.

In a sense, open source SOA software is ready-made for extra-cloud oriented business processes and relationships. Perhaps one of the supporters of these projects will become a cloud host for integration as a service?

Friday, April 18, 2008

TIBCO's ActiveMatrix earns 'Product of Year' nod from SearchSOA.com judges

TIBCO Software came home this week with a gold star when SearchSOA.com named its ActiveMatrix SOA platform a "Product of the Year" in the assembly and integration category in SearchSOA.com's annual awards.

The award was sweeter for the Palo Alto, Calif., company because it was based on products released before TIBCO announced a beefed-up version, ActiveMatrix 2.0, adding new functions and a new enterprise service bus (ESB).

ActiveMatrix provides a single platform for developing, deploying and managing heterogeneous SOA. It includes capabilities for service integration, composite application development and governance. Expect more in the service performance management space from TIBCO soon.

TIBCO claims that customers using version 2.0 can achieve additional business process productivity gains, and can lower total cost of ownership compared to alternative approaches.

One member of the SearchSOA judging panel explained why ActiveMatrix got a top spot:
TIBCO has pushed the envelope with grid architecture here. It definitely helps in terms of achieving technology independence and it gives users a service platform that should be easier to scale. Most times you see "platforms" that lack any central organizing technology. This has it and it should enable users to deploy the functionality they need only when they need it.
I've been following TIBCO's upward trajectory for some time and, more than a year ago, produced a BriefingsDirect SOA Insights Edition podcast devoted largely to TIBCO and ActiveMatrix. You can listen to the podcast here.

Last year, TIBCO architect Paul Brown joined me for a sponsored book review podcast on the concept of Total Architecture, which ActiveMatrix 2.0 undergirds. Read a full transcript of the discussion.

[Disclosure: TIBCO has been a sponsor of my BriefingsDirect podcasts. I have also been a reviewing judge for SearchSOA.com product rankings.]

Thursday, April 17, 2008

Thought leadership grows around advancing 'WOA plus SOA' as enterprise-cloud duo

Respected developer, adviser and thought leader Dion Hinchcliffe has posted a watershed blog that develops a compelling rationale for Web Oriented Architecture's (WOA's) advancing role in enterprises.

The logic is not to supplant or dismiss Service Oriented Architecture (SOA), but rather to examine how WOA -- also known as lightweight, Web 2.0 applications development and deployment -- should provide an on-ramp to, and stepping stone for, SOA generally. WOA and SOA together -- in a harmony that unlocks both the power of cloud computing and of traditional enterprise architectures -- present a very interesting future indeed.

Dion builds on recent posts by Dave Linthicum, Joe McKendrick, Tony Baer, myself, Phil Wainewright, and some reported findings by Burton Group's Anne Thomas Manes. Many others have also been developing concepts and methodologies for providing the means for enterprises to exploit pure web resources for advancing developer productivity and business process extensibility.

Dion's right. Enterprises don't need to wait four years to build out and culturally align to SOA, not when they can proceed to WOA and continue on their long-term cadence toward building what IBM calls the federated "ESB backplane" for managed business services.

WOA simply allows for many productive SOA activities now -- without the huge investment, the wrenching cultural shifts, and the required several years of IT-business transformation. WOA plus SOA forges the mentality of managed cloud-based services and continued on-premises infrastructure exploitation right away.

WOA plus SOA for even modest B2E, B2C, and B2B business processes development and augmentation is just too good a deal to pass up, and it contributes to longer-term and perhaps more highly structured internal SOA infrastructure values and practices.

Enterprise marketers grappling with huge media and online outreach change cannot wait years to gain the ability to foster, participate in, share and satisfy the needs of socially aggregated communities. Sales forces cannot go through IT and its SOA roadmap to combine data and market analysis effectively. Product designers can't manage a global supply chain using ERP alone. Procurement officers can't do more for less based on EDI alone. Integration cannot be accomplished for business ecologies based on middleware designed for point-to-point EAI.

The crucial functions of sales, marketing, just-in-time supplier integration, and just-right procurement can't wait for SOA. They can make use now of WOA plus SOA.

As Dion says:
So if so-called Web 2.0 companies — which value participation almost above all else, both from consumers and organizations that want to integrate them into their offerings — are seeing highly desirable levels of adoption and significant ROI, how can this help understand how to improve our efforts in the enterprise? Most new Web 2.0 applications start out life with an API since getting connected to partners that will help you grow and innovate is a well-known essential for success online today. Despite years of SOA, we still don’t focus on consumption and openness as fundamentally essential characteristics to building an internal partner ecosystem that have beat a path to your door to use the services you are offering to them to build upon.
And as I've said, SOA lacks the political center of gravity and heft to spur adoption through grassroots demand. The critical constituencies of users/workers, sales, marketing, product development, and procurement -- and perhaps quite a few developers -- are not demanding SOA. It remains too abstract to them, while what they see possible on the web is tangible, understandable, seemingly attainable.

SOA may be the right thing to do, but ushering in its adoption broadly and deeply is proving arduous and stifles the expected ROI, which erodes the acceleration of further adoption. WOA plus SOA can help solve this.

WOA has evolved via massive scale trial-and-error, and so has been designed through viral adoption, user pull, self-help and with self-qualification of real-time productivity in mind. It works because it just works, not because it's supposed to work, or because someday it will work. As Dion says, "And these new models intrinsically take advantage of the important properties of the Web that have made it the most successful network in history."

Cloud providers and mainstay enterprise software vendors could make sweeter WOA plus SOA music together. They may not have a choice. If Microsoft acquires Yahoo!, there will be a huge push for Microsoft Oriented Architecture that will double-down on "software plus services." And MSFT combined with Yahoo would have an awful lot in place to work from -- from the device and PC client, to the server farm, business applications, developer tools and communities, and ramp-up of global cloud/content/user metadata resources. I think Microsoft already understands the power of WOA plus SOA.

Therefore Google, Amazon, Apple, eBay, and perhaps some of the traditional communications service providers and media companies will need to form natural and more attractive alternatives ... fast. There is no reason why IBM, HP, Oracle/BEA, TIBCO, and SAP should not seek out the consumer-facing cloud partner that can bring the WOA to their SOA.

They will need cloud partners that best further their business interests, and the productivity interests of their online clients and users. Microsoft will be offering some significant enticements along these lines -- and once again getting between the providers and the users, with a cash register going cha-ching, cha-ching all the while.

Enterprises and software vendors need WOA plus SOA, if for no other reason than Microsoft needs WOA plus SOA even more.

[Disclosure: HP and TIBCO are sponsors of BriefingsDirect podcasts.]

Wednesday, April 16, 2008

Desktop as a service era ramps up as Citrix marks May delivery of XenDesktop at a tough price to beat

Heralding a new era for desktop as a service (DaaS), the long-awaited XenDesktop line from Citrix Systems will become available during the Citrix Synergy 2008 conference in mid-May.

The XenDesktop product line is the latest entry in the effervescent virtualization market and will be co-marketed with Microsoft. With the XenDesktop, companies can virtualize entire Windows desktop instances from their data centers and deliver them on-demand as a service to any workers with a web browser and broadband.

General good guy and fellow ZDNet blogger Dan Kusnetzky has a post on XenDesktop too.

With current PC software management, updating, and upkeep costing some $5,000 per user seat annually, according to some industry experts (and that's low for many companies), Citrix is predicting savings of up to 40 percent in PC desktop TCO using the virtualized server delivery and on-demand UI approach.

Because the Citrix DaaS system dynamically assembles each user's personal desktop image from fresh software components each time the user logs in, updates and upgrades are seamless -- for the operating system and applications. Central management cuts down on help desk calls and visits to the physical location of the PCs. Combined with remote access, a PC may never feel the warm touch of an IT admin or help desk steward once it's out of its shipping crate, if then.

The system separates applications from the desktop OS and provisions them independently at runtime from new master images. Each time they log in, users get a fresh desktop that is more secure, personalized, and free from corruption and conflicts. A high-speed delivery protocol provides instant access to desktops and applications over any network, no matter how far users are from the data center. That means near-instant boot-ups.

From an administration viewpoint, managing desktops and applications separately, and combining them only at runtime, allows IT to maintain a single master Windows desktop OS image for all users, rather than fully loaded desktop virtual machines for each employee. This also simplifies lifecycle management. And Microsoft still gets its CAL, but saves the enterprise on PC TCO at the same time.
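The runtime-assembly idea can be sketched conceptually in a few lines. This is an illustration of the pattern, not Citrix code, and every name below is invented: one master OS image, role-based application layers, and per-user settings, combined only at login.

```python
# Conceptual sketch of desktop-as-a-service runtime assembly.
# All images, roles, and settings here are hypothetical.
master_os_image = {"os": "Windows XP SP3", "patch_level": "2008-04"}

# Applications are provisioned separately from the OS, per role.
app_layers = {
    "analyst": ["Office", "SQL tools"],
    "developer": ["Office", "Visual Studio"],
}

user_profiles = {
    "alice": {"role": "analyst", "wallpaper": "blue"},
    "bob": {"role": "developer", "wallpaper": "green"},
}

def assemble_desktop(user):
    """Build a fresh desktop at login: the single master image,
    plus the role's app layer, plus the user's personalization."""
    profile = user_profiles[user]
    desktop = dict(master_os_image)   # everyone starts from one master
    desktop["apps"] = list(app_layers[profile["role"]])
    desktop["settings"] = {"wallpaper": profile["wallpaper"]}
    return desktop

print(assemble_desktop("alice"))
```

The payoff is visible in the data model: patch `master_os_image` once, and every subsequent login assembles a desktop with the new patch level, with no per-machine virtual machine image to touch.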

Why didn't they think of this sooner? Guess MSFT had things on its mind other than helping out its customers with the high cost of computing. I guess we're glad Citrix and others forced their hand.

And so how about that $500 PC thin client market, eh? I'm Wyse to that. HP likes it too.

What's more, Microsoft is also perhaps embracing DaaS now so as to diminish the market opportunity for non-Windows thin clients and the DaaS delivery of Linux and Unix (OpenBSD) offerings. How much cheaper would non-Windows PC TCO be with delivery via DaaS? Probably not enough to energize that market. There's always the SaaS and cloud markets, however, to keep MSFT busy on the commoditization front.

I tagged Citrix and its desktop virtualization as a powerful market disrupter about six months ago and it seems to be living up to its promise. As I said then:

When you combine virtualization benefits up and down the applications lifecycle -- with such functionality as back-end automated server instance provisioning -- you get excellent cost controls. You get excellent management, security and code controls. And you marry two of the hottest trends going -- powerfully low TCO for serving applications at scale with radically simpler and managed delivery via optimized WANs (NetScaler Web application accelerator) of those applications to the edge device.

The desktop as a service market has been bubbling vigorously lately. Just over a week ago, MokaFive announced its desktop virtualization product, which combines cloud computing with local execution.

And I expect news any day now from Desktone, which is aiming to deliver many of the same values that Citrix is with XenDesktop -- but to service providers, so that an ecology for DaaS can develop to homes and small businesses (and maybe enterprises too). The Desktone approach gives the tools to deliver DaaS to, say, your telephone company so it can offer a PC as a service at a flat fee per month. More on that later. Incidentally, Citrix is an investor in Desktone, so they see eye to eye on a lot of this.

The XenDesktop comes in three editions: Platinum, which offers the most flexible user access, performance monitoring and quality of service (QoS) capabilities, and remote virtual desktop support; Enterprise, an integrated system for cost-effective scalability; and Standard, the entry-level product.

Pricing will begin at $75 per concurrent user. Details about requirements can be found at the XenDesktop Web site. Until May 20, XenDesktop is available as a public beta and can be downloaded from the Citrix site.

Tuesday, April 15, 2008

SOA and compute clouds point to rethinking data entirely: roles and permissions, not rows and tables

There's nothing like a conversation to bring out insights and new ideas. Tony Baer and I were chatting on this very pearl of productivity last week: an open roundtable analyst call always allowed us to move the needle forward in terms of thought leadership in ways that solo writing and solo thinking cannot.

And then there's Twitter, which is sort of a blend of a roundtable chat, solo writing, one-on-one and blogging -- all to and with a refined, social graph-defined audience. And it was in this act of Twitter-thinking this morning that I slid into a new realization, new for me anyway. It has to do with data, and the need for multi-permeable access to data, across organizational boundaries, stored in many places -- access that protects valued and proprietary data while breaking processes out of the corporate IT straitjacket.

I'm seeking a better understanding of how cloud-based (public and private) webby apps can and should be a big part of SOA (especially greenfield services), and may even become the driver for rapid use and adoption of SOA. I'm also fascinated by PaaS, SaaS, IaaS, AWS, GAE, and the services ecology development from both small and large providers. And I know that social networking and new media will play into big business in a big way.

These issues have huge impact on many vendors, from Salesforce.com, Amazon, Microsoft, IBM, Google, Yahoo!, Oracle, SAP, and HP, on down to a burgeoning class of startup cloud-capability providers. Notice how I lumped together older-style enterprise IT vendors and "consumery" services providers?

The hang-up in understanding how these two worlds -- public clouds and enterprise IT (soon private clouds) -- come together hinges on who or what handles, owns, manages, and offers secure access to (or not) the data.

It's clear that applications and logic can become services, atomized or aggregated. User interfaces can be delivered as a service, as parts of apps or as entire desktops. Same for mobile access.

Of the various levels of abstraction of what goes into IT-based activities, almost all can be deconstructed and more productively delivered as thick, thin, or mixed (software plus services) webby apps. Agile business processes are the new coin of the realm. The days of standalone, PC-based apps are over. Off-the-wire services will make up more and more of what we will soon colloquially refer to as productivity applications.

But then there's the data, still strapped into a definition that associates it to applications, even as applications as we know them are evolving dramatically. After talking with Dave Linthicum, now CEO of StrikeIron, on his vision ... and Kirill Sheynkman, president and CEO of Elastra (as well as many others in the Enterprise 2.0 space) ... I'm now convinced it's time for some radical re-thinking on data, and what it is that it should actually be related to and associated with.

Perhaps it's time to fully divorce data from applications, and wed it all instead to people and groups, guided by roles and permissions, and therefore no longer co-located with applications or even enterprises. House it where it can be used easiest, and stored and protected cheapest. It's heresy today perhaps, but more of the data that matters will be in the cloud anyway.
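As a toy illustration of that heresy -- a purely hypothetical record layout and set of roles, not any vendor's model -- the permission travels with the data itself, rather than with the application or the store that houses it:

```python
# A toy sketch of data bound to people and roles rather than to an
# application or database server. Records, roles, and the sharing
# rule below are all invented for illustration.
records = [
    {"id": 1, "owner": "sales", "body": "Q2 pipeline",
     "shared_with": {"marketing"}},
    {"id": 2, "owner": "finance", "body": "Payroll",
     "shared_with": set()},
]

def visible_to(role):
    """A record is visible if the role owns it or it is shared with
    the role -- the permission lives on the record, so the rule holds
    no matter where the record is physically housed."""
    return [r for r in records
            if r["owner"] == role or role in r["shared_with"]]

print([r["id"] for r in visible_to("marketing")])  # -> [1]
```

Because the access rule references only roles and the record's own metadata, the same check works whether the record sits in a corporate database, a partner's system, or a public cloud store.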

We always think of data as tied to an application, or in a managed store (oftentimes distributed) that applications read and write to. The store can be analyzed, protected, backed up, consolidated and cleaned up, messed up again, poked and prodded. We still think of data as belonging to some store, and by association, to a department or corporation or IT sys admin.

Data is controlled and managed as centrally as possible, except that it's always scattered and inconsistent. A battle rages in every enterprise as it tries to manage and control its data. It's a losing battle that is costing more and more of the IT spend pie each quarter. And there are many good reasons for this battle -- except that it's holding us all back.

It's now clear that the current mentality of data and its place is holding us back in unacceptably unproductive ways. In this day and age, you will never control your data at the margins. Those margins used to be PCs and departmental servers. Now they are becoming clouds, social networks, free web email apps, Twitter. The data that impacts and drives your company is a complex system not unlike the weather, or quantum-level particles. Try and grab and control that rainbow, if you will. Probability rules, not exactitude.

CXOs can define the happenings of their enterprise by the audits, create the ledgers, and fill up the financials data warehouses. But so much more is going on beyond the glass rooms. The best that CXOs can hope for is to approximate the state of most actual enterprise data, and even that leaves out what is happening in the social media domains, where the innovation, customer feedback and process insights are often occurring.

All those hard drives, all those iPods, all those memory sticks. All the metadata of what your workers do via the web -- out in the murky clouds -- it is all out of your reach. This is already the case, and it will not be a genie you can put back in the bottle.

And the choices for retrenchment? Close off your workers from access to web search? Deny them access to your suppliers' portals? Erect firewalls that separate your customers, clients, and prospects from your own sales, marketing, and fulfillment providers? YOU CAN'T MOVE AT INTERNET TIME WITHOUT BEING ON THE INTERNET.

In other words, the way we treat data today is unnatural. And pretending otherwise is unsustainable. There is a huge productivity opportunity for those that can re-think data, from concept to execution, and exploit the gathering clouds. This requires radical rethinking, I'm pretty sure, though I can't say I know what the new data landscape will look like.

It will take a social capital-level ecology of complex systems in a pattern of ongoing churn for the answers to arise and quickly become outdated. There are deep and fundamental disruptions under way in media and software now that will force the hand on data. If you don't use what's free from the web and clouds, and your competition does, then what?

So what comes next? There's this murky middle muddle now between public and private clouds, SOA, independently located data, master data, and tools and development. If Google App Engine, Amazon Web Services and the near-certain follow-ons from Microsoft/Yahoo, HP, Oracle, IBM, and perhaps a handful of other major players have even a minor impact -- data conceptually is deeply changed. The cloud providers will give away the tools, single sign-on ID management, runtime, and storage, and they will probably let you keep the data that you think is your data.

What enterprises will no longer have is the control of the market data from the public clouds about the users, the groups, their behaviors, attention, their demographics, buying patterns -- all the service on-ramps and off-ramps to your enterprise's revenue growth. Private clouds may not alone be enough to reach mass or long-tail audiences. Will your database of users, your email blast list, your CRM data be as good as Google's, or Microsoft's? Can you sell and market your goods and services without using online ads? Who will sell them to you?

If you're thinking of data as associated to internal applications, as the read-write store for ERP activities, or as the list of your sales leads and prospects in CRM -- you may want to think again. Your data and the clouds' data will need to work together, and perhaps those that bite the bullet and leverage the public clouds to the hilt will have a huge advantage over those that do not.

Take a hard look at the diagram on this blog by Dan Farber, called "Google's Vision." Salesforce.com and Google think of data quite a bit differently than you do.

And just as you are in cozy partnership (like you have a choice) with your ERP or SOA vendors, enterprises and businesses of almost all sizes will be in partnership soon with the clouds. The shared and protected data alike will be scattered about, too complex for closed marts and masters. The cloud that can manage data in a way that allows both user-level and process-level access, with granular permissioning -- and allows CXOs to feel good about it all -- gets the gold ring. The cloud business is a 50-year business.

How should your data be treated in this new cloud era?

Saturday, April 12, 2008

Google App Engine creation process live on Twitter

I'm watching Dion Hinchcliffe and a small group of other observers and developers create a Google App Engine (GAE) application live via Twitter.

We're on the cutting edge of using social media and near-real-time collaboration tools (free) to learn and use GAE for free, and then blog on the process (also free). The price is obviously right, and the ease and transparency of sharing and witnessing are just about friction-free.

As Dion points out (and Dan Farber makes note), there are trade-offs between GAE and Amazon Web Services. And there are concerns to be evaluated and vetted over the application lifecycle remaining in the Google cloud, as Garett Rogers makes note.

But the process I'm witnessing here on Twitter is nothing short of breathtaking for its rapid, agile and productive online team approach (we are located all over) to web app development. Other Google services could be used, too, like Groups. And, of course, developers are well acquainted with other forms of collaboration such as CollabNet.

If even for minor apps, services, or for prototyping development of subsets of large projects, this is all very compelling. I'm fascinated by how developers will use GAE within existing projects and processes. GAE will not be used in isolation, I suspect, but will be a powerful tool in the WOA quiver. And that may also prompt more use of GAE as the end-all, be-all for more and more apps.

I know a lot of people use Amazon as a test bed for their apps. Google App Engine will be very attractive for that too.

But what Google can soon bring to the table is an ability to put these apps and services in front of a ton of other developers and huge potential audiences of end users and consumers. Google has clout of scale, metadata and reach that Amazon does not.

Like others, I hope that Google adds support for more languages beyond Python on GAE, like Ruby. I also hope they find a way to port parts or all of the apps off of GAE. Perhaps for a cost, you could choose to not only deploy via the Google cloud, but also get the basic script and code for extraction and use elsewhere, or for mixed-purpose development.

Bungee Labs has that option: developers own the code IP and can take the apps elsewhere. [Disclosure: Bungee Labs is a sponsor of BriefingsDirect podcasts.]

Wednesday, April 9, 2008

SearchSOA.com names Nexaweb's Enterprise Web Suite as 'Product of the Year'

Nexaweb Technologies, the application platform provider, just keeps piling on the accolades. Its latest achievement is the Product of the Year award for its Enterprise Web Suite from SearchSOA.com.

The gold-status award for the Burlington, Mass., company came in the RIA/Composite Application Assembly category, and was based on innovation, performance, ease of integration into environment, ease of use and manageability, functionality, and value. [Disclosure: Nexaweb is a sponsor of BriefingsDirect podcasts. I have also been a reviewing judge for SearchSOA.com product rankings.]

In January, Nexaweb received the Editors' Choice Award from CMP's Intelligent Enterprise.

Just over a month ago, I blogged on Nexaweb, and its role in helping enterprises modernize without pain, confusion or excessive cost.

For Nexaweb, the end game for enterprises is flexible composite workflows, and so the newest offerings are more than tools and platform, there’s a professional services component, to take the best practices and solutions knowledge to market as well. The process includes applications assets capture and re-factoring (sort of like IT resources forensics), re-composition, deployment and then proper maintenance. In the bargain, you can gain an enhanced platform, increased automation, and services orientation.

The goal is to harvest all those stored procedures but target them to newer architectures — from Struts to Spring — and move from client/server to Enterprise 2.0, a leapfrog of sorts. The reuse of logic then allows those assets to be applied to model-driven architectures and the larger datacenter transformation values.

Nexaweb Advance pairs Nexaweb’s Enterprise Web Suite with automated code generation tools and professional services to deliver a model-driven architecture approach to the transformation of legacy PowerBuilder, ColdFusion, C++, VisualBasic, and Oracle Forms applications . . .

Nexaweb says that its Enterprise Web Suite allows companies to extend and modernize legacy applications, while leveraging their investments in J2EE and service-oriented architecture (SOA) platforms. The company claims that organizations using its offerings have reduced application development, deployment, and maintenance costs by an average of 50 percent.