Friday, December 7, 2007

Bloomberg says mashups need governance as well as an easy development environment

Jason Bloomberg, a ZapThink analyst, is planning to hold a session at SOAWorld 2008, where he will discuss the need for mashup governance when developing service-oriented business applications (SOBAs). I won't be at his session, or even at the conference, but I read what looks like his abstract in the press.

Bloomberg's conclusion: mashups need governance, and here's why.

Way back in the day, when PCs first made their way into the business, we developed many and varied applications. These apps were complete one-offs, and at some point they became too expensive for the business to maintain. The business dropped these apps right back in IT's lap. IT proceeded to toss many of them, replace others and rewrite still others. The apps didn't have legs for long-term usefulness. Their data sources were siloed, their interfaces inconsistent and their development environments primitive.

Now we have a new wave of business-built applications, or mashup-based SOBAs in Bloomberg's terms. If we don't want to repeat our past mistakes, we need to make sure this new crop of applications is architected for the enterprise. They will need to be loosely coupled to back-end systems, employ rich interfaces akin to those now common in the consumer world, and have a development environment that will make it easy for the original business mashers to maintain them as business requirements evolve.

To this Bloomberg adds,

"...Clearly, no business would risk allowing any of its employees to assemble and reassemble business processes willy, nilly, with no controls in place to ensure that the resulting SOBAs followed corporate policies."

The problem with this statement is that employees are assembling and reassembling SOBAs willy-nilly. The lack of controls to make sure these applications follow corporate policies is one of the very things that make fast innovation possible. The ungoverned free-for-all of mashup construction is part of the mashup DNA.

So mashers are mashing, not just in the wilds of the uncontrolled web, but also behind the firewall. We don't want to stop them, and we can't trust them to self-govern. Not because they are evil, lazy or stupid, but because they should be busy innovating, not fighting a version control system. Bloomberg seems to agree, and he goes on to say that when governance is put in place the mashup environment starts looking less like a Web 2.0 environment and more like a standard development environment.

Not necessarily. The trick is to integrate light-handed governance with a mashup construction environment that looks and feels like an MS Office application, or better yet, iTunes. Certainly not like Popfly or Eclipse. But we also want mashup-based SOBAs to have legs for the long term. So we need security. We need version control. We need change management. What we don't need is to discourage our would-be mashers by making governance obstructive. Governance should be a silent side-effect of building a mashup. A side-effect ignored right up to the point where somebody accidentally honks up one of the production mashups.
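To make that concrete, here is a minimal sketch of what "governance as a silent side-effect" could look like. Everything here is hypothetical (the function names, the in-memory history list); the point is only that the masher hits save and the version record happens behind the scenes.

```python
import hashlib
import json
from datetime import datetime, timezone

def save_mashup(definition: dict, history: list) -> dict:
    """Save a mashup definition and silently record a version snapshot."""
    payload = json.dumps(definition, sort_keys=True)
    snapshot = {
        "version": len(history) + 1,
        "saved_at": datetime.now(timezone.utc).isoformat(),
        "checksum": hashlib.sha256(payload.encode()).hexdigest(),
        "definition": definition,
    }
    history.append(snapshot)  # silent version control: the masher never asked for it
    return snapshot

def rollback(history: list, version: int) -> dict:
    """Recover an earlier definition when somebody honks up a production mashup."""
    return next(s["definition"] for s in history if s["version"] == version)
```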

Is this a mere dream?

Yes and no. There are some mashup construction tools on the market that incorporate version control. There are some that include security. There are some that include change management. Unfortunately, there are none yet that include all the old application development disciplines. So yes, it is a dream for the moment.

However, the fact that some mashup enablement vendors are already thinking about governance tells me that the tools will evolve, and likely very quickly. So maybe this isn't a dream after all. Maybe it's just a product roadmap issue. Unlike our first go-round with business-built applications, we have some experience with what does and doesn't work. This time around IT should be on hand to provide a safety net rather than a brick wall. This time things just might be different.

If we've learned from the past we will create an environment where business-built applications will remain with the mashers, to wax and wane and ebb and flow right along with the business.

Wednesday, December 5, 2007

Business mashups already are process-aware

Today I read a post by ebiz blogger Keith Harrison-Broninski about mashups and process. He took a look at mashups from two perspectives represented by mashup tool vendors JackBe and NexaWeb.

(As an aside, do you think JackBe allowed him to evaluate their tool? Readers will remember that JackBe declined to let me evaluate Presto, as I've noted here and here.)

I'm in the middle of evaluating Dapper, and so far I really like what I see. I had to take time out to respond to Harrison-Broninski, however, because while I agree with most of what he says, I have to take exception to his conclusion.

Here is the money quote from his post.

...there are essentially 2 types of mashup: siloed and process-aware. For now, mashup tools are siloed. However, in due course they will become process-aware via integration with HIMS technology - and once this happens, the combination of [Human Interaction Management System] HIMS processes and mashup applications will be an extremely powerful way to leverage both Web 2.0 and your legacy infrastructure.

I'm very confused by this post because we've been saying for months that true business mashups are not only human-process aware, they are human-process centric. Mashing up the data is a great step forward. Mashing up visual and data elements at the glass is a very interesting technique, perhaps more relevant to the consumer world than the business world, but a useful technique nonetheless. However, these aren't business mashups. They are presentation and data mashups.

So what?

Hinchcliffe noted in a recent and widely cited post that there aren't yet killer mashup apps. I believe that's because we've concentrated only on putting the data and presentation components together and haven't put the results in the context of a business activity. I go into more detail in my response to Hinchcliffe, so I won't repeat myself here. The bottom line is that Harrison-Broninski has it exactly right about the need for human-centric process.

What he doesn't have right is his assertion that there are no process-aware mashup tools. He notes that BPM is a possibility, but thinks processes appropriate for BPM aren't really appropriate for mashups. Read his post for his perspective on the difference between BPM processes and HIMS processes. I'm not sure I agree with that conclusion since there are BPM vendors who have strong roots in human workflow. In fact, Forrester has an entire wave for Human-centric BPM. (I'd include a summary here, but then Forrester would insist on approving my post.)

Instead, I think the reason BPM tools haven't invaded the mashup market is because BPM practitioners haven't yet made the connection. (See my post on the subject.) Until that happens the tools won't evolve to make it easy to create mashup applications within a BPM context. I'm sure that will change, but for now read the comments from long-time BPM practitioner Sandy Kemsley.

Until the BPM tools do evolve, I'd like to point out that there are at least two process-aware mashup tools already on the market. The first is ActiveGrid, which I've already reviewed. The second is Serena Mashup Composer which has been released to the public this week. Both of these tools let users create mashup applications, and both enable mashers to define human workflow to put the mashups in the context of a business activity.

Go read Harrison-Broninski's post for an interesting discussion about the difference between BPM and human-centric applications. He makes a great case for mashups of the future incorporating HIMS. But don't think you have to wait for 'the future' for a business mashup construction environment. To steal the title of a very old movie shamelessly stolen in many, many marketing taglines, "The future is now."

Friday, November 30, 2007

What's true for blogs and wikis is true for mashups as well

In a recent post, Harvard Business School professor Andrew McAfee discussed findings published earlier in Andy Dornan's Information Week article about the perceived value of Web 2.0 technologies within enterprises. I've already commented on Dornan's article, and won't do so again, but I do want to comment on McAfee's insights relating to Information Week's findings.

Those of you who have already read McAfee's post might wonder what his article has to do with mashups. McAfee's comments relate specifically to legacy Web 2.0 (yes, my tongue is well into my cheek) capabilities such as wikis and blogs. He also extends his insights to social networking sites (SNS). Although he doesn't specifically discuss mashups, I think what he has to say applies equally well to them. Mashups intrude on application development's turf, and app dev is one of the most isolated and siloed organizations within IT. In McAfee's experience, people believe "...that corporate IT departments consciously exclude outsiders and outside influences, and are concerned primarily with expanding themselves." That's true in spades for applications development.

Here's some more of what he has to say.

Enterprise 2.0 tools have no inherent respect for organizational boundaries, hierarchies, or job titles. They facilitate self-organization and emergent rather than imposed structure...They require [grizzled R&D types] to...“practice the philosophy of making it easy to correct mistakes, rather than making it difficult to make them”...They require, in short, the re-examination and often the reversal of many longstanding assumptions and practices...

Do mashups and mashers respect organizational boundaries, hierarchies or job titles? Nope. In fact, mashers don't even respect IP. It's the essence of a mashup that the masher pulls information from where she wants, in the way she wants, to create the application she wants. With in-the-cloud deployment models and screen scraping tools, she's not going to care whether IT likes what she's doing. She's not going to go through a project approval cycle. She isn't going to get permission. She's just going to do it.

Do mashups facilitate emergent rather than imposed structure? You bet. I like to say that blogging, combined with Google search, lets anyone become a pundit. You don't need to work for Gartner or Forrester, or even eWeek. Eventually the bloggers with something interesting to say will get found and heard. The same is true for mashups. With business mashup technologies putting the means of production into the hands of subject matter experts, anyone can be a masher. The good mashups will be found and used, turning mashers into valued experts. These mashers don't need to work for IBM, BEA or even your IT department.

Do mashups promote the philosophy of making it easier to correct mistakes rather than harder to make mistakes? They sure do, and this is the very issue that drives IT nuts. For years IT Ops has been focused on keeping the production environment secure, even at the expense of business agility. It's a long and difficult process getting any change into the production environment. IT isn't being obstructive for no good reason. IT has a long history of having their butts firmly kicked for allowing problems to creep into the production environment. Their quarterly bonuses are probably tied to system availability and reliability, not to innovation. App dev organizations have a similar focus. We read about business disasters caused by faulty software. Software defects found in production are 100x more expensive to fix than defects found in the requirements or design phase. Application development organizations are told to make sure the software is bullet-proof, defect-free and completely stable before it goes out the door.

Yet with mashups we are asking the business to embrace the notion that software will almost certainly be flawed, but can always be fixed. In IT's mind this philosophy is no different than the bad-old hacker days when cowboy programmers working on the enterprise's 'big apps' could bring down an entire business unit's operations. No wonder application development organizations believe end-user created mashups are a bad idea. What app dev isn't quite yet understanding is that a mashup can be fixed in minutes or hours, whereas 'big apps' take months or perhaps years to fix. Changing the 'big app' mindset will definitely require "the re-examination and often the reversal of many longstanding assumptions and practices."

McAfee concludes that IT had better wake up and start delivering innovative solutions for the business, or they will find themselves taken out by Web 2.0. I'm not quite ready to predict that level of gloom and doom. I will certainly agree that IT needs to evolve. They aren't going to be able to block adoption of Web 2.0 capabilities, mashups included. What they can do, as I've discussed before, is provide a secure infrastructure and behind-the-scenes mentoring so these new technologies won't harm the business.

Thursday, November 29, 2007

Coghead lets you build simple web apps, not mashups

A colleague of mine last week suggested that I review Coghead as a mashup assembly platform. I've been spending the week with them and can conclude that they are neither a good nor a bad mashup platform. In fact, they aren't a mashup platform at all.

Coghead allows its users to construct simple web applications based on a back-end database. Both the application and the database are hosted by Coghead itself. They have an on-demand pricing model that is quite reasonable, with the price of a subscription increasing with the number of database records. For simple web applications in the long tail of applications development, web applications that don’t need to mash either data or visual elements, Coghead could be a very reasonable solution.

On the surface, Coghead offers capabilities similar to ActiveGrid, but only on the surface. While ActiveGrid also lets users create a web application driven from a back-end database, unlike Coghead, ActiveGrid allows users to create mashups both by exposing services and by calling out to them.

Let's start with what Coghead can do. I found it very easy to construct web forms, create search result lists, define business logic and otherwise manipulate database records. I had a fairly sophisticated app up and running within a couple of hours of signing in. The documentation is excellent, and the tutorials instructive. You really don't need to be technical to create a Coghead application. As with ActiveGrid, it's helpful to understand how relational databases work, but not absolutely necessary as long as your application is simple and doesn't require complex relationships between tables.

There were a few things I didn't like about Coghead. It was easy to get the web forms set up, but it wasn't easy to get them just right. I'd say that most form designers will spend 90% of their time tweaking the last 10% of the form. Form tweaking and other minor problems were just annoying. The most pressing problem with Coghead was its performance. I've got an excellent connection here at the office, and I had only a small app on Coghead, yet creating, updating, and deleting records was slow, slow, slow.

All this is in the noise, however. Because Coghead is not, repeat not, a mashup construction platform, business or otherwise. The only content they aggregate, and by aggregate they actually mean replicate, is by way of a csv file. That's a good choice for a database integration conduit, but it isn't a mashup. It's replication.

You might be wondering about coglets. Surely, you might think, coglets enable mashup construction. At least you might if you were on Coghead’s site and read about coglets. And what about mashouts? Don’t they count?

Sorry. Coglets are by no means mashup fodder. Coglets merely create individual web pages to display fragments of your application's data. I can create a single page, accessible through a URL, to display a single record or to display a view. I can then embed the URL into another web page. That isn’t a mashup. That’s at best a portal and at worst just a custom web page with a lot of iframes.

Mashouts go in the other direction. Mashouts allow you to use data from the current form as variables in HTTP get requests. These requests get triggered from the web form's context menu. (Web forms with context menus...you gotta love AJAX.) The target for the request is another browser altogether. That's not even a portal. It's a link to another application.
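To illustrate the mechanics, here is a rough sketch of what a mashout amounts to. The target URL, field names and parameter mapping are all invented for the example; the point is that form data is simply substituted into a GET request that opens somewhere else entirely.

```python
from urllib.parse import urlencode

# Hypothetical mapping from query parameters to fields on the current form.
FIELD_MAP = {"q": "customer_name", "zip": "postal_code"}

def build_mashout_url(target: str, record: dict) -> str:
    """Substitute values from the current record into an HTTP GET request URL."""
    query = {param: record[field] for param, field in FIELD_MAP.items()}
    return f"{target}?{urlencode(query)}"

# e.g. a context-menu action on a customer record:
record = {"customer_name": "Acme Ice Cream", "postal_code": "94104"}
print(build_mashout_url("https://maps.example.com/search", record))
# https://maps.example.com/search?q=Acme+Ice+Cream&zip=94104
```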

Let’s look at an example.

Let's say I want to have an application that consolidates all the tasks I have assigned to me. Let's also say that I get tasks from a project management system that publishes a project plan to the web, my Outlook TODO list, and from an issue tracking system used by R&D to assign maintenance tasks. There are at least four ways I could interact with these different task lists. First, I could jump from app to app to app. That’s not a mashup. It isn’t even an integration, but it’s the sort of capability provided by mashouts. You can jump from a Coghead app directly to another app in another browser. It’s our old siloed applications we know and love so well.

Second, I could copy all the data from the three apps either into a single database, or designate one of the applications as the master TODO list. Then I could access all my tasks from a single repository, hopefully with two-way replication so changed TODO data can be fed back into the original app. That isn’t a mashup, that’s a replication integration, one that has all the inherent problems associated with replicated data. These sorts of integrations are enabled through Coghead’s csv import/export feature.
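A sketch of what that second, replication-style option looks like, with made-up file and column names. Nothing here keeps the copies in sync once the import is done, which is exactly the problem with replicated data.

```python
import csv

def import_tasks(csv_path: str, master: dict) -> None:
    """Replicate tasks from an exported CSV file into a master TODO store."""
    with open(csv_path, newline="") as handle:
        for row in csv.DictReader(handle):
            master[row["task_id"]] = {
                "title": row["title"],
                "due": row["due_date"],
                "source": csv_path,  # remember where this copy came from
            }

master_todo = {}
import_tasks("project_plan_export.csv", master_todo)
import_tasks("issue_tracker_export.csv", master_todo)
```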

Third, I could create a web page where I pull the content from the three apps into the single page. In one frame I’d see the project tasks, in another my Outlook tasks, in a third my issue management tasks. This is what you get with coglets. That’s not a mashup, that’s a portal.

Finally, I could construct an application that aggregates the information from all three applications into a single unified GUI. I could reference the data but not copy it, so all three applications retain their single copy of the TODO data. I could create a unified GUI so that, unless I drilled down on a specific task, I wouldn’t even realize the data were federated across three repositories. Were I to take this approach, I’d have a mashup. Unfortunately, there are no capabilities within Coghead applications that support this approach. At least not yet. Their documentation says they will have REST services available soon, meaning that while Coghead could participate in a mashup, they still wouldn't be a mashup platform.
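Here is a rough sketch of that fourth approach, with invented endpoints and field names. Each system keeps its single copy of the data; the mashup aggregates by reference at request time and only exposes the source if the user drills down.

```python
import json
from urllib.request import urlopen

# Hypothetical task sources; each system of record keeps its own data.
SOURCES = {
    "project_plan": "https://projects.example.com/api/my-tasks",
    "outlook_todo": "https://mail.example.com/api/todo",
    "issue_tracker": "https://issues.example.com/api/assigned?user=me",
}

def unified_task_list() -> list:
    """Aggregate tasks from every source into one view without copying anything."""
    tasks = []
    for source, url in SOURCES.items():
        with urlopen(url) as response:
            for item in json.load(response):
                tasks.append({
                    "title": item["title"],
                    "due": item.get("due"),
                    "source": source,      # only visible if the user drills down
                    "link": item["href"],  # deep link back to the system of record
                })
    return sorted(tasks, key=lambda task: task["due"] or "9999-12-31")
```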

Bottom line: Coghead can certainly help non-technical users develop web-based applications in the long tail of applications development. For the 80% solution, Coghead makes it easy to create, deploy and even pay for those pesky applications for which there are no IT resources. It won’t be a mashup, but it could be enough for you.

Friday, November 16, 2007

Have fun with Mashup Composer, JackBe

I'm not one to hold a grudge. Hey, when JackBe said I couldn't download and try their software, I understood. I didn't like it, but I understood.

But let's play fair here.

Someone from JackBe has downloaded Serena Mashup Composer. If I were Fake Steve I might have a field day talking about how, now that they've had a chance to dig into Composer, JackBe must be crying like a girl. If I were Fake Steve I would be suggesting that the JackBe employee is actually getting ready for a job interview with Serena, and that's why he/she downloaded the software. If I were Fake Steve I might even suggest that this is JackBe's way of writing requirements for their next release. "Hey R&D, forget the requirements doc. Just make Presto do what Serena Mashups can do."

I'm not Fake Steve, so I wouldn't say any of these things.

I will say this, however. Presto and Serena Mashups solve different problems. Do you need your data mashed? Presto likely is a good choice. I wouldn't know for sure because I haven't been able to test it, but so I'm told and so I suspect based on the Presto marketing materials. Do you need this mashed data within the context of a business activity? Well, Serena Mashups may be the right way to make use of this mashed data.

The point is that JackBe and Serena should be partners, not competitors.

So JackBe, we've shown you ours. Will you show us yours?

Thursday, November 15, 2007

Who will be the business mashers?

One of the top two leading industry analyst firms has been talking for several months about how software applications are brittle, difficult to maintain and aren't architected to keep up with the fast-paced demands of today's business environment. That's not new news. It isn't even old news. What is news is that now we have analysts saying out loud what we've all known for quite a while: custom software development is broken.

I can't tell you which analyst firm has been talking about this issue because they will insist that I run this post through their vendor relations department. I don't really want to have my blog sanitized, so instead I'll just call them the Unknown Analyst (UA).

UA has suggested some solutions to this problem. They predict more development teams will adopt agile software development practices as a way to make app dev more responsive. That will help, but agile methods will only take us so far. Even with agile development practices, multisourcing, offshoring and otherwise scaling up capacity, app dev resources are scarce and expensive. IT has been scaled to work on big-ticket and highly complex software projects, not the myriad interesting and potentially business transforming smaller applications that are backing up in app dev's queue.

Regardless of the development methodology employed by the app dev team and how many new developers they hire, IT simply isn't going to be able to satisfy the demands of the business for new and innovative applications. Especially when, as UA says, they need to change at the speed of business: constantly.

UA suggests that this gap will be filled by business analysts. The role of the business analyst will need to change to become as much someone who assembles applications as someone who simply defines them on behalf of the business. In our world this means the business analyst will turn into the business masher.

So where do business analysts come from, and will they have the necessary skills to become business mashers? What skills will the business masher need?

Good questions, both. UA says that business analysts are grown, usually from an IT staffer who is interested in business or, more likely, from a business expert or project manager of some sort who becomes interested in IT. Today, the most important skills business analysts possess have to do with effective communications. Can the BA communicate effectively both with the business and with IT? Does the BA know how to write, analyze a problem and mediate disputes between what can be two opposing forces?

Technical skills such as process modeling, usability engineering, and data modeling are less important. Likely because those skills can be learned, but knowing how to communicate effectively both to the business and IT isn't something that can be picked up out of an instruction manual.

But are the soft skills currently important to BAs going to be adequate when these BAs have to start turning out applications?

It's a bit of a rhetorical question because I think I have an answer already. No, these skills won't be enough. I've worked with a number of BPM tools in the past. I've also worked with a number of Web 2.0 application assembly platforms, as you will know if you've read my reviews of Intel's MashMaker, Kapow's RoboMaker, alphaWorks' QEDWiki, WaveMaker's ActiveGrid and even Serena Mashup Composer. These tools either don't actually build applications (MashMaker, Mashup Editor, QEDWiki and RoboMaker fall into this category) or require some strong IT know-how when it comes to the deep-down nuts and bolts of web service calls.

Yes the tools will get better, but for now I can't agree with UA's assessment. Today's BAs aren't going to become tomorrow's business mashers. Most of them lack the necessary technical skills.

I want to digress a bit. A little over ten years ago there were a lot of complaints in IEEE circles because engineers would found companies only to have them run, owned and grown by MBAs. (Microsoft excepted, of course.) Many IT departments were run by MBAs working for the CFO rather than by technical people with knowledge of technical issues. We nerds put ourselves in technical silos, and then complained like crazy about the lack of technical know-how among our top executives.

IEEE put out a call and said, basically, "Quit complaining and go get MBAs." In other words, IEEE decided that if we nerds wanted leaders with strong technical knowledge, we'd have to grow these leaders from our own ranks. It was a great message and got a number of us to start worrying about things like entrepreneurialism, sales, marketing and finance in addition to whether Windows or Unix would end up ruling the world.

I think we have a similar chance to expand our horizons today. The BA role is changing. BAs today need soft skills, but the BAs of tomorrow will need more technical know-how. Most of these BAs come from the business side of the house and have learned about IT on the job. That means when these newly minted mashers start to create mashup-based business applications they will make mistakes. They won't necessarily think about version control. They won't know about consumer-side SOA Governance. They won't understand how to use XPath, call web services or in many cases how to set up a feedback system to manage changes to their applications.

Remember the last couple of times business built their own applications? Remember the Access databases from the 80s? Remember the web applications in the 90s? The business needed agility and they got it, but they also needed engineering discipline, and this they didn't get. In the end, when it became clear that these applications were business critical, they ended up back in IT's lap, and both IT and business lost. IT because they became villains. (Business: remember when we could make changes whenever we needed? Now everything takes forever.) Business because they lost their agility. (IT: those business guys don't understand that we have 400 things on our TODO list, and the capacity to do five of them.)

We don't want to repeat the mistakes of the past if we can help it. If we technical people want to keep control over our businesses, we have to learn about business. If we don't want to repeat the problems we had with business-built applications in the 80s and 90s, we need to step into BA roles now so we can bring our hard-learned disciplines to bear on this new application construction paradigm.

Calling all IT application developers! Why not take the plunge and become a BA? Walk away from Eclipse and give up Java. (Although you will probably still need JavaScript.) Learn what makes your business tick. Read "Crossing the Chasm." Listen to a marketing podcast. Change those sneakers for loafers, throw a sports coat over that t-shirt and go apply for the BA job in sales your HR department just posted.

This time around, let's do it the right way.

Wednesday, November 7, 2007

ActiveGrid is useful for web-based application building, but like so many mashup tools, you've got to have a fair amount of technical know-how

You'll notice from the title that I'm going to review ActiveGrid by WaveMaker. I wanted to review either Denodo, JackBe or AlchemyPoint by Orchestr8, but both Denodo and JackBe declined to allow me to evaluate their software, and Orchestr8 needs to publish some more documentation before I can do a good evaluation.

Why couldn't I test drive Denodo or JackBe?

Denodo declined because they sell enterprise middleware that requires some hand-holding to install and understand. They don't allow web downloads at all, let alone downloads by potential competitors. They did offer me the chance to participate in a directed demo, but having been in the software business for a long time, I just don't trust demos. I need to dig in and get my hands dirty. I want to thank them for making the offer, however.

JackBe was more straightforward. Serena is a competitor, and they aren't knowingly going to hand over their software to someone who could have malicious intent. That would be me, I suppose.

Fair enough. I'll just have to sign up under a different email address and company name and get my hands on it that way. Perhaps I'll use my Second Life avatar. Better watch out for a download request from Chuck Malibu, JackBe. (Note: This is a joke. I would never do such a thing, even to support my blogging habit.)

So while I can't talk to you about Denodo, JackBe or AlchemyPoint, I can talk to you about a surprisingly interesting web application building tool called ActiveGrid. WaveMaker is the company name, although as little as a week ago the company name was ActiveGrid, just like the product name. I'm not quite sure why they changed their name, but as long as they didn't change the tools, this review should still be accurate.

What is ActiveGrid?
ActiveGrid advertises itself as a Web 2.0 platform (who doesn't?) allowing non-technical or business-oriented people to build their own AJAX-based RIAs and mashups. Some of that is true, and some is a bit of a stretch. It's true that you don't need a lot of technical know-how to build data-based applications with highly interactive AJAX-driven interfaces, although some familiarity with the workings of a DBMS is necessary. When I started to build mashups, well, that required a bit more technical know-how, which I'll talk about later.

ActiveGrid's application model is based on a database with a web front-end driven by a BPEL process engine. It reminds me strongly of the old Access-driven applications we built back in the 80s and 90s. Start with a database, add some user interfaces and some Visual Basic to manage the business logic, and there you have it, a home-built application. Of course, ActiveGrid is quite a bit more sophisticated.

I used ActiveGrid to build an application that used the various RSS feeds and REST interfaces I defined in OpenKapow. (See my earlier post.) It was easy to assemble the screens as long as I used the defaults. They do a fairly good job of putting the widgets, data fields, etc. on the display in a visually pleasing way. They have a small selection of style sheets which were adequate. Of course, just as with QEDWiki, you can use your own style sheets.

Building ActiveGrid Web Pages is a Mixed Bag
I liked a lot of the page-building features. Selecting service fields to use on a page was quite easy. I didn't need to map the fields to page elements; ActiveGrid did all that for me. One very nice feature allowed me to take individual pages I created and, using AJAX techniques, quickly aggregate them into a single page. I liked that a lot.

Some things I didn't like about the page builder. What I want, and what most web-based application page building tools provide, is a visual modeling tool with a drag and drop interface allowing me to quickly mock up the page. That's not how GUI development works in ActiveGrid. Setting up page formats reminded me strongly of the way we built XWindows/Motif interfaces back in the day. Trust me, it was tedious and time consuming.

BPEL for Page Orchestration and an Intuitive Event Model
ActiveGrid uses BPEL for page orchestration, which is an interesting although unorthodox choice for an interactive application. BPEL is usually associated with the WS-* stack for web service orchestrations, not with human workflow. They've done a good job of making it work by unifying their service model. Everything is a service. A call to an RSS feed is equivalent to a SOAP call is equivalent to a script is equivalent to a database call. It's an interesting approach and one that works well. I found their event model, which they use to control page flow, to be easy to understand and easy to use. They only have synchronous events, but that's a small price to pay for simplicity. BPEL supports asynchronous events, so perhaps they will add them in the future.

For Mashups, Better Get Some Expert Help
Now let's talk about building mashups. ActiveGrid absolutely qualifies as a tool for building business mashups. They aggregate data and visual elements into a unified end-user experience within a process-centric framework. However, just because it is a business mashup tool, it doesn't follow that ActiveGrid is a tool for business mashers. In fact, I'd say that ActiveGrid requires quite a bit of technical savvy to create mashups. More savvy than we can expect from your average business analyst-turned-masher.

As I've hinted earlier, you can call many types of services from within ActiveGrid. A call to an RSS feed is equivalent to a SOAP call is equivalent to a database query. From my perspective as a business masher, I agree with this approach. Why should I care whether my service is REST, SOAP, RSS, POX or just a database call? Admittedly, I need to know the type of service when I import it, but after that it's just inputs and outputs.
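This is not ActiveGrid's actual implementation, just a minimal sketch of the "everything is a service" idea: once a service is registered, the masher sees only named inputs and outputs, regardless of what sits behind the call. All of the back-ends here are placeholder stubs.

```python
from typing import Any, Callable, Dict

class Service:
    """A uniform wrapper: after import, every service is just inputs and outputs."""
    def __init__(self, kind: str, call: Callable[[Dict[str, Any]], Any]):
        self.kind = kind  # 'rss', 'soap', 'sql' -- only matters at import time
        self._call = call

    def invoke(self, inputs: Dict[str, Any]) -> Any:
        return self._call(inputs)

# Placeholder back-ends standing in for a feed reader, a SOAP client and a database.
def read_feed(inputs):   return [{"title": "item from " + inputs["url"]}]
def soap_lookup(inputs): return {"city": "Portland", "zip": inputs["zip"]}
def sql_query(inputs):   return [{"order_id": inputs["id"], "status": "shipped"}]

registry = {
    "news":   Service("rss",  read_feed),
    "city":   Service("soap", soap_lookup),
    "orders": Service("sql",  sql_query),
}

# From the masher's point of view, the calls all look the same.
print(registry["city"].invoke({"zip": "97201"}))
print(registry["orders"].invoke({"id": 42}))
```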

But, and this is a huge 'but,' when it comes time to aggregate these services so they can be used to present a common interface, we come to a real problem area: Field mapping and parameter passing. I can only say that field mapping and parameter passing within ActiveGrid, well, sucks. It's likely not their fault. I don't know any application builder, especially one based on BPEL, that does a good job with field mapping and parameter passing. When I had to pass parameters into a service call, it took me quite a while to figure out how to do it. In fact, simply following the steps in their 'getting started' application to pass a parameter from one page to another was confusing and took me several tries before I understood what they were doing.

Let's just say that ActiveGrid didn't seem to take a lot of trouble to hide the complexity of the BPEL field mapping monstrosity from their end-users. No graphical drag and drop, no suggested mappings. Not even pop-up help so you could read the help and look at the interface at the same time. Just a lot of steps to define input variables, output variables and a really bad interface to define the logic that associates the two together.

Yuck.

Bottom Line

Business mashers can use ActiveGrid to build web-based applications quickly and easily. With some understanding of how databases work, it's quite simple to build highly interactive and appealing applications. When you want to turn them into mashups, however, you'd better get some serious help from your applications development organization because it's neither quick nor easy.

Friday, November 2, 2007

This just in: mashups may be on their way from The Peak of Inflated Expectations to the Trough of Disillusionment

Andy Dornan published an article Monday in Information Week, and it was republished to Internet Evolution yesterday. The article was interesting because it went into some detail about whether Web 2.0 techniques (I won’t call them technologies) are ready for widespread use in the business, and he backs it up with some survey data. This has been a popular subject lately, as demonstrated by postings by Hinchcliffe, Baer, and even myself here and here.

I've already responded to Dornan about whether mashups will ever get 'business chops,' (his wording) so I won’t write about it today. Instead, I’d like to talk about some of the findings from the Information Week survey data that were the basis of Dornan's article.

Information Week surveyed 110 respondents about various Web 2.0 issues. First, respondents were asked “What is Web 2.0?”, which is a good question with which to start. 15% said Web 2.0 was a waste of time and bandwidth and 53% said it was an overhyped buzzword. (Multiple selections allowed.) I’m going to assume that the 15% overlapped the 53% and conclude that 38% of respondents believe that while Web 2.0 is an overhyped buzzword, which is true, it isn't a complete waste of time, which is also true.

This is progress.

In the last year the value of AJAX, wikis, YouTube, blogging and RSS has been demonstrated to business. While more people may be sick of the term, they are also adopting the technologies. Definitely progress. Regarding mashups in particular, after all this is a mashup blog, 54% put browser-based applications under the Web 2.0 umbrella. Of course it would have been better if they had been called by their proper name: presentation mashups. But a year ago I doubt if 54% of respondents would have understood the significance of browser-based applications, mashing at the glass, and how that is different from the old web-based integrations we've been using for years. Again, progress.

The article also says that 40% of respondents thought mashups were a Web 2.0 attraction. At least they thought so in February. The survey results weren’t explained in the article, but I was interested to see that in September the rate dropped to less than 10%. I'm not quite sure I'm reading the correct meaning behind the numbers. I'm assuming they floated two identical surveys. One in February and one in September. This makes sense if you correlate it with Gartner’s Hype Cycle for Portal Ecosystems, 2006. If Gartner’s predictions were correct, we were due for a plummeting hype rating for mashups sometime in 2007. Personally, I hope it’s true because I’m all for the mashup hype calming down so we can roll up our sleeves and get some real work done. I’ve posted enough on this subject that I’ll let it rest for now. While marketing departments want the hype, the rest of us would rather move on to the Plateau of Productivity as soon as possible.

One set of numbers that is a bit more sobering is that over 60% of those asked to cite objections to Web 2.0 said that security was an issue. The high percentage demonstrates that the respondents weren’t stupid. In my mind security is the one problem with SOA, various Web 2.0 technologies in general and mashups in particular, that could stop progress in its tracks. At least for a while. The first time sensitive personal information gets leaked out in a mashup, and the masher is a member of a company with lots of money, we are going to see some serious lawsuits and a lot of backpedaling on the mashup adoption front. It won't stop us forever, but it could stop us for a while if we don't start to take security very seriously.

I’d like to close with some thoughts on the Thompson Financial mashup example cited in the piece. Thompson has been building true business mashups before mashups were cool. In Thompson's view, the business building their own mashups doesn't bring them into conflict with IT. Each does what they do best. IT provides the secure infrastructure, the backups, the servers. Thompson's subject matter experts build the applications. It's definitely a great model for turning the idea of business mashups into the reality of business mashups.

While I've been concentrating on the mashup data from this article, mashups are by no means all Dornan covers. He discusses wikis, social networking and other Web 2.0 'issues.' The article is well worth a read, even for those of us on the downward slope from The Peak of Inflated Expectations to the Trough of Disillusionment.

Friday, October 26, 2007

QEDWiki is a great tool, but is it really for creating mashups?

Believe it or not, this post is a review of QEDWiki, the mashup development tool being promoted by IBM. QEDWiki’s current home is with alphaWorks, whose charter is to put new IBM technologies into the hands of developers. AlphaWorks helps IBM get products and ideas fleshed out before bringing them to market. IBM says that 40% of alphaWorks offerings end up in IBM products. The other very cool message is that IBM is willing to experiment and drop 60% of its alphaWorks research products by the wayside. Wouldn't you just love to have IBM's research budget?

AlphaWorks has many products available for download, but the one currently getting all the press in mashup circles is, of course, QEDWiki, part of IBM’s mashup starter kit.

Unsurprisingly, IBM claims QEDWiki uses a wiki metaphor to allow non-developers to build mashups. Just as with a wiki, end users can create pages, modify or copy pages if they have permissions, share pages and even put in links to jump from one page to another.

That's just what I did. It was very easy to create a page and drop in widgets. I tied the various page elements together so clicking on a selection in a feed changed the contents of a URL widget elsewhere on the page. I used a web service from StrikeIron to look up the city and state associated with a zip code, and even let QEDWiki build the form to ask for input. It certainly wouldn’t take a developer to understand and use QEDWiki. While usability still needs some work (there are odd refresh problems and widget placement issues) the tool itself seemed sound.

After I pulled in a number of feeds, added widgets, and constructed URLs based on widget contents I had very quickly created a visually appealing and functional…portal.

Yep, I said portal.

Here’s the problem. When I wanted to bring different data streams together, I had to do it elsewhere, not within QEDWiki. So I found myself running to Yahoo! or Kapow or even del.icio.us to create feeds. (OK, the del.icio.us feeds weren't, strictly speaking, mashed.) Once I mashed the feeds I could drop them onto my wiki page, but I couldn’t use QEDWiki itself to mash the contents.

Here’s another problem. When I wanted to pull data from different web pages into a single unified end-user experience, I had to run over to Kapow to create the web service, then go back over to QEDWiki to use the service within my wiki page.

You might conclude from what I’ve said so far that I didn’t like QEDWiki. On the contrary. I liked it very much. Other than the aforementioned usability issues, it was very easy to get started and definitely was not a tool that required development know-how. Once I had mashed content elsewhere I loved being able to drop elements on the page and link them together. It was great to pull search results based on a dissected URL parameter list. It was a no-brainer to click on a feed item and have another widget populated with the data served up from the item's link. I don’t pretend to have tried every single feature, but the features I did use were excellent.

It just wasn’t a mashup. QEDWiki consumes mashed content. It displays mashed content. It does not itself help mashers create mashed content.

That looks like it is the responsibility of Mashup Hub. Mashup Hub is the other part of IBM’s mashup starter kit, along with QEDWiki. I’ve queued it up for review some time in the future, but just looking at the specs, it looks like a server to transform behind-the-firewall data into feeds. While that’s great functionality, it’s hardly new or innovative. Unless I hear otherwise, I'm going to drop it at the end of my eval queue.

Go ahead and give QEDWiki a try. I'm betting you will enjoy working with it. Don’t get rid of your other mashup tools just yet, however. You will definitely need them.

Monday, October 22, 2007

How close are we to overcoming the 10 challenges facing business mashups?

Once again I’m going to delay my QEDWiki review. Really, I’m going to get to it. Honestly. However, I decided that I needed to discuss overcoming the challenges presented by Dion Hinchcliffe in his post last week, The top 10 challenges facing enterprise mashups. Simply taking a futurist approach as I did in my last post didn't seem like it would be enough.

Hinchcliffe’s post generated a lot of discussion, both within the mashup community and within Serena Software specifically. A number of us debated his points, discussed whether we could help overcome the challenges and even used his post to guide discussions on features we plan to put in our future product releases. I guess this makes Hinchcliffe an honorary Serena Product Manager. Thanks! And thanks to my many colleagues at Serena Software whose ideas have been integrated into this post.

Hinchcliffe’s ten challenges fall into three broad categories: business challenges, governance challenges and technical challenges. Rather than addressing each of his ten issues, I’ll address the categories.

Business challenges: Lack of business support for mashups and lack of killer mashup applications.

Remember when the web started to grow? At first it was full of sites with pretty pictures and cool graphics. Organizations created websites as experiments or as another avenue for advertising. It wasn’t until the web killer app came along, eCommerce if you were wondering, that we had the dot-com explosion. We can repeat this story with SaaS and Salesforce.com. When the business sees a killer app, the business wants the killer app. Once we find the Salesforce.com equivalent for mashups, we’ll have the business lining up to invest.

Why hasn’t this happened yet?

Because we’re too busy talking about how cool mashups are. While cool is cool, it isn’t a killer app until it solves a business problem. We can take maps, charts and videos, we can pull in data from multiple sources and we can mash them together at the glass into a visually exciting experience for the mashup user, but no matter how cool it is, it won’t be a killer app until it’s scalable and useful. The problem with at-the-glass mashups is they don’t put the mashup in the context of a business activity. Yes, it’s great that I can pull data from many sources, but if the data aren’t actionable, what’s the point? If I can’t reuse the business logic across the organization, then why invest?

Let’s use an example. Assume I run a fleet of ice cream trucks and I want to make the best use of the trucks. I could use a presentation or data mashup to help by pulling local event information from online community calendars, school activity calendars, business announcements and even law enforcement announcements. I could map these events on a Google Map along with information about the likely size and times of the events. Using this information I could develop a schedule to optimize the routes of my trucks.

That’s a nice way to use mashups, but it isn’t a killer app. It’s not even a business mashup. It’s a data mashup with some cool graphics. A killer app would take the information from the mashup and use it automatically to schedule trucks, drivers and inventory to make sure the right trucks were at the right locations with the right inventory at the right time. The killer app would keep updating event information. A killer app would know when trucks are due for maintenance and schedule the maintenance around heavy usage days based on the mashed-up information. Our truck scheduling application is a business mashup because it puts the mashed up information in the context of the larger business problem, namely, optimizing ice cream truck utilization. The data aren’t enough. The data must be actionable and solve an actual business problem.
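As a rough illustration of the difference between interesting data and actionable data, here is a sketch of just the scheduling step, with invented event and truck records. A real business mashup would feed this from the aggregated calendars and announcement feeds described above.

```python
def schedule_trucks(events: list, trucks: list) -> list:
    """Turn mashed-up event data into an actionable schedule: which truck goes where."""
    available = [t for t in trucks if not t["maintenance_due"]]
    schedule = []
    # Biggest expected crowds get a truck first.
    for event in sorted(events, key=lambda e: e["expected_crowd"], reverse=True):
        if not available:
            break
        truck = available.pop(0)
        schedule.append({
            "truck": truck["id"],
            "event": event["name"],
            "stock": "heavy" if event["expected_crowd"] > 500 else "normal",
        })
    return schedule

events = [
    {"name": "School fair", "expected_crowd": 300},
    {"name": "Outdoor concert", "expected_crowd": 2000},
]
trucks = [
    {"id": "T1", "maintenance_due": False},
    {"id": "T2", "maintenance_due": True},  # schedule maintenance around heavy-usage days
]
print(schedule_trucks(events, trucks))
```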

Once we understand that a killer mashup app has to be in the context of a business activity, that the mashup data has to be actionable, and that the mashup itself must solve business problems, then we will start to see a lot more businesses take mashups seriously. Until then, well we can always console ourselves that we are cool.

Governance Challenges: An immature services landscape, confusion over management and support of end-user mashups, chaotic data quality and accuracy, and version management.

I’ve written about this issue before, both in my futurist post about the semantic web, and earlier when discussing the role IT can play as a trusted advisor to the business with respect to business mashups. Some discussions bear repeating, however, so I’ll cover some of the same ground again.

Lack of mashable content and data quality are interrelated. Without supported services tied to systems of record, mashers will have a difficult time ensuring the quality of their data. Long-term I believe this is a problem for the semantic web. Short-term, however, vendors need to start getting serious about enabling access to products through web services. At Serena we’ve already started this process, and we will continue to add services for the foreseeable future. As mashups become more accepted in the business community instead of just an IT tool I expect we will see this trend emerge with other software vendors. Note to business mashers: If you want your vendors to provide web services, you’d better start demanding them.

Management and support of mashups will be problematic and will get worse as more mashups are developed by the business community rather than IT. When talking to IT professionals about mashups developed by the business, this issue is where IT has the most heartburn. As Hinchcliffe notes, once upon a time this same scenario played itself out with PCs, databases and spreadsheets. The business started something, building applications, that it couldn’t support long-term and IT was tasked with providing support for applications about which they knew very little. IT has a long memory. I doubt if they will be taken by surprise again.

Surprised or not, IT isn’t going to be able to stop business mashers from developing mashups. Not only does the business have too much at stake, but the new generation entering the workforce doesn’t have a lot of patience with corporate hierarchies. They’ve grown up with technology and won’t wait around for IT to build their applications. To stay relevant, IT needs to become the partner of business and provide a secure and scalable infrastructure in which the business can build mashups.

It is inevitable, however, that the business will eventually need support for their mashups. We could see a move towards centralization once more, just as we did when the business handed back all those Access databases to IT. However, business has a memory just as long as IT's, and they will remember that while centralization did bring order to the mish-mash of rogue applications, the cost was business agility and strict IT control. I suspect that many on the business side of the house will look for an alternative.

Enter a new breed of vendor whose business will be to support the business. Budget oversight being what it is, these new vendors will likely provide support as part of a subscription process within a SaaS model. These vendors will need to fly under the capital expense radar and simply be a line-item on a department’s monthly expenses, similar to a cell phone bill. That means many business mashups will be purchased as part of a subscription model with support being provided by these new vendors. That way the business can build their mashups, but can also have a number to call when they need help.

I agree with Hinchcliffe that mashup version management has to be part of any mashup tool vendor’s offering. Lucky for Serena we’ve already got mashup version control as part of our mashup tools.

There is another version control issue that needs to be confronted, however. Version control of the individual services has long been a problem within SOA implementations. It’s a dark not-so-secret that uncontrolled services can cause disaster in SOA-based applications. If the SOA implementation has a successful reuse policy, the problem is even worse since a single bad service can bring down any number of applications. And yet there is no way for the SOA client to know whether a service has changed. Here vendors and 3rd party web service vendors need to be held accountable by consumers. Until that time, version control will continue to be a challenge.
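Since there is no standard way for a consumer to learn that a service contract changed underneath it, one crude stand-in, sketched below with placeholder URLs and checksums, is for the consumer to fingerprint the service description (the WSDL, or a REST service's schema) and compare it against the last known-good value before each release.

```python
import hashlib
from urllib.request import urlopen

def service_fingerprint(contract_url: str) -> str:
    """Checksum a service description so a consumer can notice when it changes."""
    with urlopen(contract_url) as response:
        return hashlib.sha256(response.read()).hexdigest()

known_good = "0a1b2c..."  # checksum recorded the last time the mashup was tested
current = service_fingerprint("https://vendor.example.com/CustomerService?wsdl")
if current != known_good:
    print("Service contract changed -- re-test every mashup that consumes it.")
```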

Technical Challenges: No construction standards, the splintering of widgets, deep support for security and identity, and low-level mashup support by major software firms.

I’m bullish about overcoming the technical challenges cited by Hinchcliffe. If we can get the business to throw their weight behind mashups, the vendors will be under tremendous pressure to start providing solutions that make it easy for the business to adopt the business mashup model.

However, I’d like to challenge Hinchcliffe’s assertion that we need a unified method for mashup construction. Ditto for widget technology. It would be great if all the tools had a consistent approach, but I’m not sure I’d classify it as a challenge for mashup adoption in the enterprise.

Business mashers will have domain knowledge and a level of technical competence consistent with building Excel spreadsheet macros. Given that business mashups need to mash data and visual elements in the context of a business activity, it’s clear that model-based construction is the solution with legs. Our business mashers won’t be writing JavaScript. They won’t be writing any sort of code, even if that code is disguised as an XML document. They will be dragging and dropping visual, data and process elements using a familiar office-like interface. If that’s the case, the end user won’t care what is happening under the hood. A consistent method of construction may be a challenge for the vendors, but not for the mashers.

I do agree with Hinchcliffe that support for mashups among infrastructure and application vendors will continue to be an issue for some time. However, we might be able to solve some of the problems in the short-term. For example, if we are to put mashed content in the context of a business activity, we must have some sort of event driven architecture, or at the very least, a simple eventing system. Every vendor has one. Even Serena has one. We use the eventing system within the open source Eclipse ALF project. Eventing systems require participating software to kick off some external communication when important things happen.

Let’s consider our ice cream truck example. Ideally, the mashup would need an event to kick off rescheduling truck routes when a concert gets cancelled, a new truck is purchased or a driver quits. The ALF project has tried to make the eventing system generic by providing web services to raise events, but again, the web services have to be tied to custom actions within the ALF event management system. While the pattern is the same for other vendors’ eventing systems, the devil is in the details.

One way to overcome this is to use eventing systems that already exist. Email leaps to mind, as do Outlook meeting reminders. Many back-end systems already know how to send emails and already integrate with Outlook. While it may not be the best of all possible worlds, it would certainly jump-start event-oriented business mashups if the onus was on the mashup tool vendors to integrate with these existing event channels.
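A minimal sketch of that idea, with placeholder addresses and SMTP host: the back-end raises an event over a channel it already has (email), and the mashup side watches the mailbox for tagged subjects and reacts.

```python
import smtplib
from email.message import EmailMessage

def raise_event_by_email(event_type: str, payload: str) -> None:
    """Raise a business event over a channel every back-end system already has: email."""
    msg = EmailMessage()
    msg["From"] = "scheduler@example.com"
    msg["To"] = "mashup-events@example.com"
    msg["Subject"] = f"EVENT:{event_type}"  # e.g. EVENT:concert_cancelled
    msg.set_content(payload)
    with smtplib.SMTP("smtp.example.com") as server:
        server.send_message(msg)

def on_incoming_mail(subject: str, body: str) -> None:
    """The mashup side polls the mailbox and reacts to tagged subjects."""
    if subject.startswith("EVENT:concert_cancelled"):
        print("Rescheduling truck routes:", body)
```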

As for other low-level support, once again Hinchcliffe has it right. We can solve some of the issues, but the bulk have to wait until software vendors feel the squeeze from customers demanding low level mashup support.

I’ve saved the hardest problem for last: security. If anything is going to kill SOA and the companion consumption-side technologies, it will be security and identity management. Consider web-based applications. We’ve been at those for over ten years, and we still don’t have security under control. With SOA the problem is even worse because there are myriad potential back-end systems engaged in every mashup, and to date the most common method of passing around credentials is either as a parameter to service calls, or in the service header. With RESTful services the problem is aggravated since the WS-* standards generally don’t apply at all.
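For illustration, here are the two fragile patterns that paragraph describes, with a placeholder key and endpoint. Neither scales to a mashup that fans out across many back-end systems, which is exactly the identity-management gap.

```python
from urllib.request import Request

API_KEY = "secret-key-123"  # placeholder credential

# Pattern 1: credential passed as a service-call parameter. It ends up in URLs,
# server logs and browser history at every hop of the mashup.
url_with_credential = (
    f"https://api.example.com/accounts?user=jsmith&apikey={API_KEY}"
)

# Pattern 2: credential passed in a header. Slightly better, but every
# intermediary in the mashup still sees, and must be trusted with, the raw secret.
request = Request(
    "https://api.example.com/accounts?user=jsmith",
    headers={"X-API-Key": API_KEY},
)
```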

One promising solution is the open source Eclipse Higgins identity management project. Many vendors have already signed up to use Higgins, but again, until all vendors adopt the standard, we are going to have the potential for serious security breaches within mashups. Especially mashups at the glass.

My conclusion is that yes, we have some challenges, but in many cases these challenges are either already in the works to be solved, or there is at least a roadmap for solving them. The ones that aren’t going to be overcome in the short-term will be side-stepped. How? I don't know. I do know that once the business understands the potential of business mashups, nothing will get in the way of widespread adoption.

Thursday, October 18, 2007

Can the semantic web help with the ten challenges facing enterprise mashups?

I'm going to delay my review of QEDWiki yet again to comment on Dion Hinchcliffe's post, The 10 top challenges facing enterprise mashups. Hinchcliffe's blogs about Web 2.0 have been very influential over the past few years, and this excellent posting is no exception.

Fair warning: I’m going to use his post as an excuse to go off on a futurist binge and talk about the semantic web. Don't worry, though. I'm going to talk about the 'real' semantic web, not the ivory tower version.

I won't reiterate Hinchcliffe’s points; you can, and should, read them for yourselves. However, I do want to talk further about two of his challenges that I think are related, and relate directly to the power of the emerging semantic web. His #2 challenge is an immature services landscape. There just aren't enough services out there to provide mashable content. His #6 challenge relates to data quality and accuracy. How do mashers know whether the data are accurate and up-to-date?

I see these issues as interrelated. The lack of 'supported' services is driving people to create services for themselves using various tools, HTML screen scraping being the one I've been working with lately. Before you dismiss screen scraping as a viable content creation strategy, note that the number of robots available from OpenKapow outstrips the number of services from StrikeIron and the number of APIs available from Programmable Web combined. Lack of services is causing people to turn to self-help methods to get mashable content directly from web pages. Yet we all know that web pages often have out-of-date data or even absolutely trash data.
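
For anyone who hasn't tried it, here's roughly what hand-rolled screen scraping looks like in a few lines of Python. The page structure (a td element with a class of "score") is invented; the point is how little stands between any web page and 'mashable content,' and how little tells you whether that content is any good.

    # Minimal hand-rolled screen scrape: pull the text of every table cell
    # tagged with a (hypothetical) class="score" attribute out of an HTML page.
    from html.parser import HTMLParser
    import urllib.request

    class ScoreScraper(HTMLParser):
        def __init__(self):
            super().__init__()
            self.in_score_cell = False
            self.scores = []

        def handle_starttag(self, tag, attrs):
            if tag == "td" and ("class", "score") in attrs:
                self.in_score_cell = True

        def handle_endtag(self, tag):
            if tag == "td":
                self.in_score_cell = False

        def handle_data(self, data):
            if self.in_score_cell and data.strip():
                self.scores.append(data.strip())

    def scrape_scores(url):
        html = urllib.request.urlopen(url).read().decode("utf-8", errors="replace")
        parser = ScoreScraper()
        parser.feed(html)
        return parser.scores  # nothing here says whether the data is current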

Do you know about The Greys? The Greys are a crossbreed between humans and an extraterrestrial reptilian species. By visiting this site I learned that there are over 70 distinct species of Greys. Wow! Good thing I have this website around to help me find such valuable information.

‘The Greys’ is an extreme example, but there are others that are less silly. If you were scraping content from the US Open site about who was in the women's final, you would get one set of names for 2006 and another for 2007. Yet once the data is abstracted through a service call and incorporated into a mashup, it won't be obvious that the 2006 data is out of date. Mashup users won't, and shouldn't, be able to tell where the data came from. Mashups are first and foremost about presenting a unified experience to the mashup user. Noting where data comes from makes the mashup less of a mashup and more like a plain old integration.

How can mashers solve this problem? One way is to create more supported services so mashers will depend less on tactics such as screen scraping to get their mashup content. I doubt this will work. By some estimates there are between 19 and 30 billion web pages today, and that doesn't even count dynamic pages such as search results from the Snap-on Tools site. We aren't going to create web services to expose reliable data for all of those pages. People who need mashable content are going to get it where they can, and that means web pages themselves.

Another way to help with the data reliability problem, and this is where I think the web is going, is to start leveraging the capabilities of the semantic web. I’m talking about the practical semantic web that is emerging from the likes of del.icio.us, Facebook and Amazon. I’m not talking about the ivory tower semantic web with volumes of ontologies, deductive rules and AI searches. Some call this emerging web “Web 3.0” and some say the ivory tower version of the semantic web is “Web 3.0.” Personally, I don’t care what we call it, but I’m excited about what it is, or rather, what it can become.

To backtrack, the semantic web is a way of structuring web content so that it can be consumed both by humans and by machines. Most web content today is only consumable by humans. (Irony. It's everywhere.) That’s why we get so many trash results even from the greatest search engines. In the ivory tower version, every web page has both semantic information (what the information on the page means) as well as content. The semantic information makes the page machine consumable. A phone number is a phone number is a phone number. Once a program knows the content is a phone number, it knows how to handle said content.

In theory, but not in reality, since there are many ways to tag and format a phone number.
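
A tiny example of the problem: the same phone number shows up in wildly different shapes, so a program that wants to treat a phone number as a phone number has to normalize it after the fact. The formats below are just a few common ones, not an exhaustive list.

    # Tiny illustration: the same US phone number, formatted several different
    # ways, normalized to one canonical form.
    import re

    samples = [
        "(650) 555-0123",
        "650.555.0123",
        "650-555-0123",
        "+1 650 555 0123",
    ]

    def normalize_us_phone(raw):
        digits = re.sub(r"\D", "", raw)   # strip everything but digits
        digits = digits[-10:]              # drop a leading country code if present
        return "{}-{}-{}".format(digits[0:3], digits[3:6], digits[6:10])

    print({raw: normalize_us_phone(raw) for raw in samples})
    # All four variants collapse to '650-555-0123' -- the kind of agreement the
    # semantic web would give us up front instead of after the fact.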

In practical terms today, web content is being slowly categorized by various tag clouds such as del.icio.us, social network sites and blogging sites such as the one you’re visiting now.

Today these clouds are disaggregated without any sort of consistency. However, while it is unlikely we will get universal acceptance on what amounts to a tag dictionary, it is highly likely we can get universal acceptance of a small number of tags. This has already happened in specialty areas such as research libraries. Imagine a rating tag being adopted by all tag clouds so site visitors can rate the quality of a web page à la Digg, or an expiration date so mashers know when content is out of date, or even a copyright tag telling mashers the page is off limits for scraping. Not that mashers would pay attention.

Imagine a world where a masher pulling content from a web page through HTML harvesting of some sort could be given a rating of how good the data is likely to be. And even with disparate tag clouds, it would be possible for mashing tools to suggest alternative content pages. Imagine a world where the mashup itself could warn users if content quality degrades below some acceptable level.
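
Here's a sketch of what a mashup runtime could do with such tags if they existed. The tag names (rating, expires, no-scrape) are purely hypothetical; I've invented them to show quality metadata driving a warning.

    # Hypothetical sketch: a mashup checks (invented) quality tags attached to a
    # content source before including it, and warns when quality looks doubtful.
    from datetime import date

    def assess_source(tags, minimum_rating=3):
        """Return a list of human-readable warnings for a tagged content source."""
        warnings = []
        if tags.get("no-scrape"):
            warnings.append("Source is marked off limits for scraping.")
        if tags.get("rating", 0) < minimum_rating:
            warnings.append("Community rating is below the acceptable level.")
        expires = tags.get("expires")
        if expires and date.fromisoformat(expires) < date.today():
            warnings.append("Content is past its expiration date.")
        return warnings

    # Example: a page rated 2 out of 5 whose content expired after the 2006 final.
    print(assess_source({"rating": 2, "expires": "2006-09-10"}))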

Finally, mashers and mashup users would have some indication of whether they are getting the latest scores, the most reliable news or the best information on extraterrestrial species.

OK, this is all for the future, but perhaps the not-too-distant future.

I’ll see what I can do to convince Serena Software to start thinking about the semantic web and how we can use it to help business mashers. Meanwhile, go give Hinchcliffe's blog post a thumbs-up vote.

Tuesday, October 16, 2007

Is BPM another form of business mashup?

I was going to write a review of QEDWiki today, but IBM’s recent announcement about their starter kit has made me decide to leave it until I can do some more investigation. Given the flood of articles and blog posts about IBM’s announcement, I’m sure nobody will miss that I’m going to post on something else.

Specifically, what does the BPM community think about mashups?

BPM and SOA have been joined at the hip for several years. With SOA, business processes could be deconstructed into participating services which could be re-assembled like LEGO® bricks and executed in a production environment. I believe, although many BPM advocates would disagree, that service orientation saved BPM as a discipline. Once an organization had SOA, BPM could be used to solve myriad business problems rather than just model them.

Given the BPM-SOA link, I was very interested to read this article published on BPM Institute’s site titled SOA And Mashups – What to use when by Dr. Raj Ramesh, a BPM and SOA implementation specialist. Interestingly and surprisingly, Ramesh believes that mashups and SOA are distinct techniques for solving business problems. He says, “The question then is whether IT should use SOA or should it use mashups?” Ramesh goes on to conclude that IT should use mashups if the application is a one-off, and SOA, with BPM as the consumer, if services have the potential for reuse.

Why was I surprised? SOA versus mashups is an apples-to-oranges comparison. He should have compared BPM and mashups instead. BPM, presentation mashups and data mashups are all SOA consumers. You might want to read Rich Seeley’s article on SOA consumption patterns for a quick overview of how mashups and BPM are related. Service orientation enables mashups; it doesn't compete with them.

In fact, I would argue that BPM based on SOA is itself a form of business mashup. BPM is process-centric, pulls together content from many sources and presents a unified view of the content to the process participants. Isn’t that a business mashup?

Earlier in the piece Ramesh said that, “SOA provides a methodical paradigm for a robust long-term architecture based implementation. Contrast these to mashups that are easier to develop but are a challenge to manage due to the immaturity of the tools.”

Clearly BPM practitioners don't yet see the relationship between BPM and business mashups. Too bad. With a little effort aimed at making their tools easier to deploy, BPM vendors could already be sitting on some pretty mature business mashup tools.

Tuesday, October 9, 2007

IT should be an almost-silent partner of the business masher

I'm taking a break from reviewing QEDWiki to comment on Tony Baer's excellent posting about the history of application innovation and the mashup. With mashups, he says, app dev innovation history is repeating itself. I couldn't agree more. Sometimes we forget lessons of the past in the rush to embrace the new and innovative.

Did I say "sometimes?"

We always do. And that's actually OK because behind every innovation there are a bunch of insiders who declare that it can't be done, shouldn't be done, or has already been done and failed. These people have an important place in the innovation loop, providing valuable negative feedback that should stop out-of-control oscillation. As long as the negative feedback doesn't overwhelm the positive motion of innovation, it's useful. Without collective amnesia, however, negative feedback can stop innovation in its tracks.

If collective amnesia is necessary and not at all evil, that may be why so much innovation has happened outside of IT. IT can't afford to have collective amnesia. They are directly responsible for keeping the CEO and CFO out of jail, making sure books get closed, people get paid, products get delivered and customers get billed. Radical innovation in that environment carries unacceptably high risk. That's why laptops infiltrated the workplace with new light-weight applications long before IT caught on. Web applications came out of marketing departments, and mashups, well, they are oozing out of everywhere. Everywhere except IT, that is.

Baer's point, however, is that after the rush to innovate, mundane concerns eventually do set in. Once you've built a mashup, once the enterprise has started to depend on the mashup, how will the data be kept accurate? Who will maintain it? When it starts to be used in ways that the original masher didn't intend, who will support it? These are the everyday concerns that are in the DNA of professional coders in IT. When Computer Science professionals start to build an application, they think about these issues up-front.

Not so for the innovative business masher. He or she can build an application quickly, deploy it easily and watch it grow virally throughout the organization. Right up to the time the CEO gets bad information because the mashup referenced old data.

Crash!

So am I saying that business mashers need to stop innovating? Regular readers of this blog will know that I wouldn't say anything like that. Business developers are going to drive innovation further and faster than IT exactly because they aren't hampered by such mundane considerations. They need to keep moving forward, but they need help from IT even if they don't know it or are unwilling to admit it.

What sort of help? IT can give good advice about the best way to access data so it remains up-to-date. IT should set up a secure infrastructure so the business masher doesn’t inadvertently release sensitive data into the world. IT can also give some gentle technical advice about best practices for assembling applications. They can help mashers set up a very light-weight process for getting feedback (AKA bugs or enhancement requests) and help them work with version control before disaster strikes.

Before you in IT decide this isn’t your problem, here's the money quote from Baer: “At the end of the day, mashups that evolve to enterprise mashups are not like enterprise applications. They are enterprise apps. The only difference is that they piece together much much faster.”

Again, I couldn't agree more.

Friday, October 5, 2007

Kapow's RoboMaker has lots of features, but needs a usability study

I've been working with Kapow's RoboMaker for the past few days, investigating it for ease of use with the business masher in mind. RoboMaker is a desktop tool that enables mashers to bring content together from many sources to create RSS feeds, RESTful web services and even portal content.

From a pure 'number of features' measure, it is hands-down the most capable product I've looked at so far. Once I figured out how to use it, I was able to create an RSS feed from a news article list on Serena's website in very short order. In fact, I used the same basic steps to build both a feed and a RESTful web service.

Kapow's approach is similar to that taken by Intel's Mash Maker. (Or rather, since Kapow's offering has been out for quite a while, Intel Mash Maker's approach is similar to Kapow's.) That is, a masher identifies the content to be mashed by pulling a web page into RoboMaker. The masher uses HTML tag paths to tell RoboMaker where to extract the content and, depending on the output destination, how to format the output.

RoboMaker has a number of additional features, such as the ability to branch, call out to services, simulate mouse clicks, execute JavaScript, etc. This is both a blessing and a curse. A blessing because the richness of features allows mashers more flexibility when developing content. A curse because without solid and user-friendly documentation, the extra features can be difficult to find and use.

I didn't see any evidence of a WADL, which isn't surprising considering the industry hasn't yet decided whether RESTful web services actually need a WSDL-like contract. Since it will be mighty difficult for a non-technical user to pull a service into a process-based orchestration engine without a contract, I hope we do embrace either WADL or some other contract standard. Until then, I think Kapow should be proactive and publish both the service and the WADL contract. I know, that's easy for me to say since it isn't my R&D budget.

The resulting feed, service or portal content, as specified by the robot (in Kapow-speak) is stored 'in the cloud' on Kapow's server. Kapow also supports OpenKapow, a user community where mashers can publish, share, search for and discuss robots and mashups. (Note: Kapow also has an on-premise edition that I didn't test drive. I can certainly see how the on-premise edition would be very useful for organizations trying to leverage the valuable data that lies trapped behind the firewall.)

So in most ways, except for the sheer number of features, RoboMaker is a lot like Mash Maker. Since they both use HTML scraping, that isn't a surprise. Unlike Intel, however, RoboMaker robots produce mashable content that is based on standards, so you can create the mashup content using RoboMaker and use any mashup tool to build the mashup itself.

That is mighty cool.

But is it ready for a business user?

Back in my Unix days we used to say that the Unix shell was 'expert friendly.' That is, once you knew the ropes, you could use the shell to do an amazing number of things. That's the way I feel about RoboMaker. I found the documentation circular and confusing, and, even more infuriating, the error messages unhelpful and in many cases downright cryptic. However, once I found the rough edges and understood what was going on, I was able to do a lot of mashing very quickly.

Back to the question. Despite the need for better documentation and a usability development cycle, RoboMaker is the first mashup tool, besides Serena Mashup Composer, of course, that I believe I could put in front of a business masher.

Tuesday, October 2, 2007

Business Mashups roll out in India

From ITVIDYA.com: The Chicago-based [sic] software applications company -- Serena Software has announced the availability of Serena Mashup Composer in the Indian market for the point-and-click creation of business mashups. Just as Web 2.0 technologies have made it possible for millions to create custom Web applications using open APIs and tools from various companies.

Well first, Serena is based in San Mateo, CA, not Chicago. That aside, this is a great article that accurately conveys Serena's business plan, both in the US and in India.

The concept of a 'business mashup' is fairly new, and not well understood today. For many people, process-centric mashups are akin to what you can build with Kapow's RoboMaker. In RoboMaker, mashup developers can use some process capabilities such as branching, to build an RSS feed or a REST-based web service. This is not what Serena means by process-centric mashups, however.

For Serena, a business mashup is much like a business process. That is, the process is long-running with a persisted state, spans functional silos and involves multiple stakeholders. A business mashup is also like a consumer and data mashup, with content pulled in from multiple sources, both within and outside of the organization. With Mashup Composer, business users can build these mashups without having to get IT involved.

I suggest you read the article and, of course, try Mashup Composer.


Thursday, September 27, 2007

Intel Mash Maker shows promise, but isn't there yet

I've spent the last couple of days playing around with Intel's Mash Maker, trying to see if it is a tool I would feel comfortable putting in front of a business masher. As with so many other mashup tools, the answer is "No." Not only is it much too fragile at the moment (their server has been down more than it has been up over the past couple of days), but it still requires users to know HTML and XPath to scrape content.

However, unlike Google's Mashup Editor and Yahoo! Pipes, I can see the skeleton of a real business-user tool in the making.

For those of you who may not have read about Mash Maker, it is a semantic web-based HTML screen scraper developed at Intel's Berkeley Research Lab. The idea behind the tool is to leverage the power of everyone in the world to classify, tag and otherwise bring under control the chaos of content that is now the web. Then, when the content is tagged and understood, allow said content to be mashed with other content while providing hints about other available mashups.

Here is an example. There is nothing new about plotting all the good burrito restaurants near a specific location. We've been mashing up those sorts of sites for a couple of years now. However, what if you wanted to find the good burrito restaurants that were within walking distance of a tire repair shop? If you had to use Google or Yahoo! to develop this mashup, it wouldn't be worth your time. However, with Mash Maker you can create the mashup in minutes.

Run a search for burrito joints in your area code. Show the results as a table (a 'standard' mashup provided by Mash Maker) and copy them. Run a search for tire repair shops in your area. Paste in the burrito restaurant information and map the results. Now you have plotted on one map the burrito joints and the tire repair shops. It's faster to build the mashup than it would be to run two Google Map searches and tally the results on paper.

Why do I say the tool isn't yet ready for prime-time? Server crashing problems aside, getting to the point where Mash Maker understands the content in a web page is difficult. Any page that Mash Maker doesn't understand, and that includes nearly every website that I visited, needs to have a data extractor written to scrape out the content. Creating a data extractor requires the user to know how to write XPath expressions, and how to understand the relationship between the page's HTML and the rendered content. Sorry, but that's too much to ask of a business user. No, not because the business user is stupid, but because these users are experts in their subject area, not in HTML and XPath.
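
To give a feel for what writing one of these data extractors demands, here's a sketch using the third-party lxml library against an invented page. This isn't Mash Maker's own API, just the flavor of XPath work the tool currently pushes onto the user.

    # Sketch of the XPath skill a hand-written data extractor requires.
    # Uses the third-party lxml library; the page structure is invented.
    from lxml import html

    page = """
    <html><body>
      <table id="restaurants">
        <tr><td class="name">Burrito Bros</td><td class="rating">4</td></tr>
        <tr><td class="name">Taqueria Uno</td><td class="rating">5</td></tr>
      </table>
    </body></html>
    """

    tree = html.fromstring(page)
    # The masher has to know both the HTML structure and XPath syntax to get here.
    names = tree.xpath('//table[@id="restaurants"]//td[@class="name"]/text()')
    ratings = tree.xpath('//table[@id="restaurants"]//td[@class="rating"]/text()')

    print(list(zip(names, ratings)))  # [('Burrito Bros', '4'), ('Taqueria Uno', '5')]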

I have to conclude that "Mashups for the masses" is still only a pipe dream. However, I think Intel is on the right track. While Mash Maker isn't ready for my business mashers yet, it could be in the future.

Friday, September 21, 2007

Google's mashup editor is still for developers

I've been playing around with Google's Mashup Editor, trying to see if it is a reasonable tool for business mashers. It didn't take me long to know the answer: No, it isn't.

It is certainly an improvement over the old way of building Google Map mashups. That involved writing JavaScript inside HTML pages and deploying to a handy web server. This wasn't a problem for me because I've been coding for many years, and I happen to have a web server on my machine. A business masher, someone with subject-matter expertise but not deep technical knowledge, wouldn't want to write the HTML or JavaScript, and likely wouldn't have a web server on hand to which they have deployment access.

Is Mashup Editor any better than browser-side mashing? In some ways yes, in many ways no. Google allows mashers to deploy mashups to Google's server, which is definitely an improvement for business mashers. Deploying to the cloud is likely the only way a business analyst will get his or her mashup published. IT isn't going to be able to spend time deploying mashups to internal servers. Certainly not when the mashups will probably need to be deployed several times before they are 'right.'

So Google is definitely on the right track, but Mashup Editor is still for developers, not for business mashers. Mashup Editor requires mashers to write code. It doesn't matter that the 'code' is in the form of an XML document. It is still code. For me it is a great tool because I can include feeds as easily as maps within the mashup. No more jumping between Yahoo! Pipes and Google Maps. I can also use existing HTML, including style sheets, and add data and map elements from Google's expanded tag list.

I would never, however, put Mashup Editor in front of a business analyst to build business mashups. Even though there is no longer a need to write JavaScript for browser-side content mashing, mashers still have to use HTML, and XML tags are still programming.

If you want to combine visual and data elements in a mashup, you are comfortable with HTML, style sheets and XML tags and you don't care about process, go ahead and use Mashup Editor.

Is it a tool for business mashers?

Nope.

Wednesday, September 19, 2007

Where will we get business mashup data? Not from Intel's MashMaker.

Business mashups promise to put the means of application production into the hands of subject-matter experts. With the right tools and some know-how, anyone can create a process-centric application that mashes up data from several sources, all without bothering IT.

Here's a problem with that scenario: Where will we get the services? Creating and deploying web services to use in mashups requires technical skills generally found in IT. Will business mashers stand up SOA infrastructures? Will they code up some SOAP or REST-based web services to deliver the content they need to mash?

Unlikely, and this will become an increasing problem. Business mashers need a way to create these services without the help of IT. It isn't that IT doesn't want to help, but IT is simply too busy keeping the mail server running. Business requests for services will end up in the seemingly bottomless backlog of IT development requests.
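
To put some weight behind "unlikely," here's about the smallest thing that could be called a REST-style service, written with nothing but the Python standard library and invented data. Even this much assumes somewhere to run it, plus security, monitoring and change management on top.

    # The bare minimum a "just code up a REST service" request implies:
    # a process that answers HTTP GETs with JSON. The data and port are invented.
    import json
    from http.server import BaseHTTPRequestHandler, HTTPServer

    TRUCKS = [{"id": "T-01", "route": "Downtown"}, {"id": "T-02", "route": "Harbor"}]

    class TruckService(BaseHTTPRequestHandler):
        def do_GET(self):
            body = json.dumps(TRUCKS).encode("utf-8")
            self.send_response(200)
            self.send_header("Content-Type", "application/json")
            self.end_headers()
            self.wfile.write(body)

    if __name__ == "__main__":
        # Now keep it running, patched, secured and backed up -- that's IT's job.
        HTTPServer(("localhost", 8080), TruckService).serve_forever()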

It doesn't have to be that way, however. Intel, an unlikely entrant into the consumer mashup business, has announced a program called MashMaker that uses HTML screen scraping to let users mash any web page. Well, I suspect not just any old page. I expect that pages using Flash may pose some problems.

Of course, this is going to cause a firestorm of controversy. Who owns data on a website, and is it legal to pull it into mashups? You can be sure that industries relying on copyrights (music, newspapers, books...) are going to have a hissy fit. The courts will have to figure that out.

Regardless of the copyright issues, Intel is on the right track. They want to combine del.icio.us-like tags with Amazon-like recommendations to suggest mashup ideas to users. (People who mashed this page also mashed this other page...) That's a great idea, but I'm concerned about Intel's approach. Sure you can mash-up any web-page, but you have to use Intel's mashup tools to do it. Intel seems to be going down the proprietary path, and that's a shame. We've worked long and hard in the software industry and finally have some standards available for constructing loosely-coupled applications. They may not be perfect, but they work.
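
Here's a toy sketch of the "people who mashed this page also mashed..." idea: count which pages get mashed together and suggest the most frequent companions. The mashup history is made up, and a real system would obviously work from shared tags and usage data at vastly larger scale.

    # Toy co-occurrence recommender: "people who mashed this page also mashed..."
    # The mashup history below is invented.
    from collections import Counter

    mashup_history = [
        {"burrito-finder", "tire-shops", "city-map"},
        {"burrito-finder", "city-map"},
        {"tire-shops", "city-map", "traffic-feed"},
    ]

    def also_mashed(page, history, top_n=2):
        counts = Counter()
        for mashup in history:
            if page in mashup:
                counts.update(mashup - {page})
        return [p for p, _ in counts.most_common(top_n)]

    print(also_mashed("burrito-finder", mashup_history))  # e.g. ['city-map', 'tire-shops']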

Intel's MashMaker isn't set up to work well with others. You can't deconstruct a page and access the data you've extracted in, say, a BPEL service flow or from a BPM engine. MashMaker would be much more powerful if it generated a callable web service that could be consumed by any standards-based loosely-coupled application.

I've signed up for the MashMaker beta, so I'll let you know more when I get a feel for what it can do. Meanwhile, it's a step in the right direction at the very least.

Monday, September 17, 2007

Do we need business mashups?

I've been talking quite a bit about business mashups lately, and besides the standard "What is a mashup?" and "Why did you have to call them that?" I've been asked whether we need business mashups. Aren't the mashups we already have good enough?

That's a good question, but to answer it I need to explain what other sorts of mashups are out there in the world. For simplicity's sake I will only talk about software mashups, not music mashups, video mashups or any other sort of mashup.

There are several types of software mashups, the most common being the consumer mashup, best exemplified by the many GoogleMap applications. GoogleMap applications range from calculating the distance of a lunchtime run to showing which hotels the stars frequent in Beverly Hills. Just to play around, I even created one myself, showing the good burrito joints near my office. If you're interested, you can take a look at many consumer mashups by visiting Programmable Web.

Other common types of mashup are “data mashups” and “enterprise mashups.” A data mashup combines multiple data sources into a new data source, such as combining the data from multiple RSS feeds into a single feed with a graphical front end. An enterprise mashup usually integrates data from internal and external sources. For example, it could create a market share report by combining an external list of all houses sold in the last week with internal data about which houses one agency sold.
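
To make the data mashup idea concrete, here's a simplified sketch that merges the items from two RSS feeds into one list ordered by publication date, using only the Python standard library. The feed URLs are placeholders, and real feeds need far more defensive parsing than this.

    # Simplified data mashup: merge the items of two RSS 2.0 feeds into one list.
    # The feed URLs are placeholders; items are assumed to carry a pubDate.
    import urllib.request
    import xml.etree.ElementTree as ET
    from email.utils import parsedate_to_datetime

    FEEDS = ["http://example.com/feed-a.rss", "http://example.com/feed-b.rss"]

    def fetch_items(url):
        xml_text = urllib.request.urlopen(url).read()
        root = ET.fromstring(xml_text)
        for item in root.iter("item"):
            yield {
                "title": item.findtext("title", default=""),
                "link": item.findtext("link", default=""),
                "published": parsedate_to_datetime(item.findtext("pubDate")),
            }

    def merged_feed(urls):
        items = [item for url in urls for item in fetch_items(url)]
        return sorted(items, key=lambda i: i["published"], reverse=True)

    # merged_feed(FEEDS) is the new, combined data source a front end could render.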

Consumer and data mashups aren’t enough if you are trying to solve business problems. These problems generally have several things in common. They involve multiple stakeholders, they cross organizational boundaries, they interface with multiple back-end systems, and at the heart of every business problem is a process: a process for provisioning new employees, approving expense reports, or approving sales discounts.

Consumer and data mashups don’t provide the necessary capabilities to solve business problems. Business mashups not only offer a unified experience, like consumer mashups, but also have at their core a powerful process engine. This process engine lets you cross organizational boundaries and pull together stakeholders throughout your business. You get the right information to the right people at the right time so they can get the job done. No consumer or data mashup can do that.

So do we need another sort of mashup?

Oh, yes, we do.