Sunday, March 19, 2006


RPC Engineering?

At InfoWorld’s SOA Executive Forum in San Francisco this past week, Jon Udell moderated a panel about the different communication methods used by services. The panelists discussed the range of options, from coarse-grained transfer of complete business documents – orders, invoices, and the like – to remote procedure calls with small sets of data. You can make arguments against big document exchanges on the basis of communication efficiency, but you can also make arguments against lots of small information exchanges because of the extra overhead needed to maintain state while all the little exchanges are carried out.

But the best argument for coarse document exchanges is that if you go that way you’ll be making a conscious design choice, and you’ll almost certainly have invested some effort into designing the document models or evaluating standard ones. The documents you exchange will more likely be ones that are easy to process, because they’ll have unambiguous semantics and use robust code sets and identifiers. They’ll also be easier to reuse across a range of related partners and services.

This isn’t to say that fine-grained information exchanges can’t also be well-designed with interoperable semantics. But many proprietary APIs are getting turned into web services by tools that do little more than slap angle brackets around "get" and "set" methods, and you often get what you pay for when you adopt these low-cost “automatic” design techniques.
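To make the contrast concrete, here is a minimal sketch in modern Java. Every name in it (OrderRpcService, PurchaseOrder, and so on) is invented for illustration; no real toolkit or WSDL produced it. The point is only the shape of the two interfaces:

    import java.util.List;

    // Fine-grained "RPC engineering": a tool can wrap each setter as a web
    // service operation, but the semantics live in the call sequence and the
    // server must hold state across all the little exchanges.
    interface OrderRpcService {
        long beginOrder(String customerId);           // server allocates state
        void setShipTo(long orderId, String address);
        void addLineItem(long orderId, String sku, int quantity);
        void setCurrency(long orderId, String isoCode);
        String submitOrder(long orderId);             // state finally discharged
    }

    // Coarse-grained "document engineering": one self-describing business
    // document carries the whole order, so a single stateless exchange
    // suffices, and the document model itself is a deliberate design artifact.
    record LineItem(String sku, int quantity) {}
    record PurchaseOrder(String customerId,
                         String shipToAddress,
                         String currencyIsoCode,      // a robust code set (ISO 4217)
                         List<LineItem> lineItems) {}

    interface OrderDocumentService {
        String submit(PurchaseOrder order);           // one call, one document
    }

Notice how the document version pushes the design effort into the PurchaseOrder model, where it can be reviewed, standardized, and reused, instead of into an implicit call protocol.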

Of course I’m biased. My book is called Document Engineering, not RPC Engineering.

-Bob Glushko


Comments:
Hey Bob,

Have you a feed? I cannot see it :-/

Sean McGrath
seanmcgrath.blogspot.com
 
We are currently replacing an orchestrated service of more than 50 calls with a single document. It's been a bear to support the granular interface: it's difficult to trace and understand, and it's hard to switch or add endpoints.

So we're going to a document. It'll be hard work, but it's the right decision.
 
Ultimately I think we need both styles, although the reason is more practical than technical.

Software developers (and their tools) are more in tune with calling methods with parameters, whilst business and data analysts are more comfortable seeing "documents" being passed around systems. I'm currently consulting for an organisation that has document requirements sourced from multiple systems, with multiple development teams "owning" those systems. When it comes to providing an interface to retrieve the required data, each interface is limited to its own system. In SOA terms, this is a data service and is provided in RPC style. The consolidated document that is required is formed by a business service, using a process which calls multiple data services from different systems. Either way, to make all this work, agreed data semantics and processes are key, and data quality must be in place to "clean the dirty water".
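To sketch that pattern in Java (every service and record name here is invented for illustration, not taken from the actual systems I'm describing):

    import java.util.List;

    // Each source system exposes its own RPC-style data service,
    // owned by that system's development team.
    record CustomerRecord(String id, String name) {}
    record OrderRecord(String orderId, String status) {}
    record BalanceRecord(String customerId, long centsOutstanding) {}

    interface CustomerDataService { CustomerRecord findCustomer(String id); }
    interface OrdersDataService   { List<OrderRecord> findOrders(String customerId); }
    interface BillingDataService  { BalanceRecord findBalance(String customerId); }

    // The consolidated document the business consumers actually want.
    record CustomerStatementDocument(CustomerRecord customer,
                                     List<OrderRecord> orders,
                                     BalanceRecord balance) {}

    // The business service orchestrates the data services; agreed identifiers
    // and clean data are what make the join across systems possible.
    class CustomerStatementService {
        private final CustomerDataService customers;
        private final OrdersDataService orders;
        private final BillingDataService billing;

        CustomerStatementService(CustomerDataService c, OrdersDataService o,
                                 BillingDataService b) {
            this.customers = c;
            this.orders = o;
            this.billing = b;
        }

        CustomerStatementDocument assemble(String customerId) {
            return new CustomerStatementDocument(
                customers.findCustomer(customerId),
                orders.findOrders(customerId),
                billing.findBalance(customerId));
        }
    }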

I guess the question is: when we refer to system-to-system interfaces, does it really matter? When the data is presented back to a user it has an applied context, which renders it information. When a web service consumes a request, it is a technical system issue to ensure that the agreed semantics are built into the processing of that request and the data passed to it.

I have found that the challenge with web services (as with component-based software development) is to provide an interface which suits multiple clients, all with potentially different contexts for the use of the same data.

Bob: I still intend to read your book, so I apologise if any terms are used differently from how you have defined them.