code-u-like

C#, .Net, SQL Server, Salesforce, Dynamics CRM, etc.
Charity and Not-for-Profit sector

Monday, January 28, 2008

The Internet in 1995

Recently I was clearing through a box of old stuff, and I found a copy of ".Net - The Internet Magazine" from March 1995. Seeing the magazine again reminded me of when I bought it - I was in my third year at University, and the only practical way I had to get onto the net was to use the Sun Sparcstations in the university lab. From what I remember of the internet in 1995, most of it was about Monty Python or Star Trek. What I found in the old magazine pretty much supported that memory.

Some interesting snippets from this important historical document include:

An article with the headline "Microsoft's on-line plan still waiting - The Microsoft Network is raring to go, but Windows 95 is delayed, yet again.". The article explains that Microsoft expects their on-line commercial services such as MSN to become 'as popular as TV'. MSN is planned to be an AOL-type network with proprietary content not available on the main internet. Meanwhile, the shipping date for Windows 95 has been put back for a third time, to August 1995.

Another article, headlined "Windows NT: the easy option", explains that Windows NT 3.5 comes complete with FTP, gopher and Web server software. However, IIS wasn't around yet (it first appeared as an add-on for NT 3.51, a few months later). The internet server software for NT 3.5 was freeware that had been developed by the European Microsoft Windows NT Academic Centre for universities and colleges running Windows NT servers.

On the cover, the magazine estimated that 23.5 million people were 'netsurfing' daily in March 1995, although inside the magazine William Poel points out that "The last time I looked, combined sales of Internet magazines in the UK were running at about 5 times the number of actual dial-up accounts". So in 1995 people seemed to be spending more time reading about the internet than actually getting on to it. In January 2008, the estimate of the number of people regularly online was 357.9 million, although the metric is no longer called 'people netsurfing daily'; it's now the 'Active Digital Media Universe'.

Magazine writer Daniel Dern discussed the difficulty users have finding their way around the web:
"Most of the "making the internet easier to use" announcements we see are for front-end software programs, which are much easier to produce than the intelligent searching agents we all really need. In the meantime, searching the internet remains slightly less difficult than decoding a restaurant menu in a foreign language - with the best way to surf being to visit Home Pages and hotlists that other people have gone to the trouble of setting up, such as the justifiably popular Yahoo at http://akebono.stanford.edu/yahoo/."
At that time, Yahoo was a large collection of categorised links - it was already very popular and would start its transformation into a major web portal in the following years. The research project that led to Google would not begin until January 1996, and would initially also have a Stanford URL: google.stanford.edu. The domain google.com was eventually registered on 15th September 1997.

The magazine had a go at reviewing the popular browsers of the day, but this was a largely academic exercise, as Netscape Navigator 1.0 had been released a few months earlier and was way ahead of everything else.


(Some browser screenshots from March 1995, when grey page backgrounds were all the rage)

The March 1995 issue of ".NET" was particularly focused on music, with lots of information about bands on the net. Even back then, it was pretty obvious that the music industry was going to have to change. A site called The Internet Underground Music Archive (IUMA) was already offering web-space to unsigned bands so that fans could download their music directly. The magazine gives the site a good review, but notes that:
"... the songs themselves are available as stereo MPEG audio files, but file size runs at a whopping 1Mb per minute, so a four minute tune would take approximately a zillion years to download from the UK."
Although the bandwidth wasn't there yet, people could still see where the future lay:
"IUMA subscribes to the growing view, that CDs will soon be compiled at home, with the music downloaded through fibre-optic cables to recordable CD-ROMs. 'This will represent the future of music distribution,' agrees Michael Stone, an LA lawyer who has provided free legal advice for IUMA. 'Once fibre-optic cables are hooked up to every home, you can kiss the record and video stores goodbye.'"
Pretty prophetic ... well, except for failing to predict the rise of MP3 players, the fact that broadband speeds can be achieved without fibre-optics, and that people do still kind of like buying CDs and going to record shops.

Finally, there was an interview with Ricky Adar, who in March 1995 was preparing to launch his Cerberus Digital Jukebox system. Cerberus was basically the iTunes Store - an integrated online music purchasing, downloading and playing system with built-in DRM (except they didn't call it DRM back then). But this was eight years before the iTunes Store would launch, and Cerberus had already done a lot of bleeding-edge work: audio compression, DRM, online payment, and negotiating deals with record labels and the Performing Rights Society.

As you've probably never heard of Cerberus, you can guess that this commendable cutting-edge initiative never really took off. They did launch, and became the first company in the world to distribute music over the Internet legally. But they were too far ahead of their time - people were not ready to play music through their PCs then, and MP3 players were years away from becoming commonplace. In an interview in Sound on Sound magazine in Feb 1999, Ricky said: "The truth of the matter is that we haven't had that many downloads. There is a market on the web -- but we don't think it's a market that's going to mature as fast as we'd like". Around that time Cerberus also experimented with Virtual Record Store machines in shops and cyber-cafes. Customers could choose the tracks they wanted, pay the machine, and get a CD-R burnt for them. They were in Levi's stores for a bit, but it didn't take off.

Reading about Ricky Adar left me wondering what he's doing now, after such a promising startup failed to start-up. Googling him didn't turn up anything recent, so I imagined him being eternally grumpy somewhere and spending his time smashing up iPods with a hammer, having given up on the software industry. But then I found this blog post, and it turns out Ricky Adar was a pseudonym, and Richard Faria is now CEO of tenomichi.com, purveyors of some very complex-looking 3D tools. As he says in a comment on the blog post, "Lets hope I'm not ten years too early again!".

Monday, January 7, 2008

Web Services between Dot Net and Not Net

If you've only ever worked with Web Services in Dot Net, you could be forgiven for expecting it to be easy to use Web Services to interface with other platforms. In Visual Studio, it's all a bit Fisher-Price: you define your Web Methods, then add a Web Reference to the client and everything ticks along nicely. You don't even see any XML.
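For anyone who hasn't seen the Fisher-Price version, the server side really is only a few lines. Here's an illustrative sketch (the class name, method and namespace URI are all made up, not from the real project):

using System.Web.Services;

[WebService(Namespace = "http://example.com/demo")]
public class DemoService : WebService
{
    // Mark a public method with [WebMethod] and ASP.NET exposes it through the
    // .asmx page, generates the WSDL, and handles the SOAP XML for you.
    [WebMethod]
    public string Echo(string message)
    {
        return "You said: " + message;
    }
}

On the client side, you add a Web Reference pointing at the .asmx and call the generated proxy class as if it were a local object.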

Recently I've been working for a client getting a Dot Net Web Service to work with a third-party system built in Perl. I have now discovered there are two sorts of web services:
1) Mickey Mouse Web Services in which the client and server both run Dot Net
2) Proper, Serious Web Services in which the server runs Dot Net and the client runs Not Net (anything else).

Other people in the industry seem to have noticed this too, and the current official term for Proper Serious Web Services is 'Interoperable Web Services'; that is, Web Services That Actually Operate.

Other people have already written lots of advice for building Interoperable Web Services. Here are a few articles I've found useful:
Returning DataSets from WebServices is the Spawn of Satan and Represents All That Is Truly Evil in the World (from Scott Hanselman's blog)
Top 5 Web Service Mistakes (by Paul Ballard, at theserverside.net)
Top Ten Tips for Web Services Interoperability (by Simon Guest at Microsoft)

One piece of advice that keeps cropping up for building Interoperable Web Services is to build them 'Contract First'. The teams working on the client and server ends of the web service get together and agree the XSDs that define the request and response of each Web Method. This irons out any problems with supported or unsupported types at an earlier stage. Code is then generated from the XSDs (or the WSDL) rather than the other way round.

The third-party people we were working with suggested the contract first approach, so we ended up swapping XSDs back and forth. This worked pretty well. When it came to code generation, Visual Studio 2005 provides two command-line utilities, xsd.exe and wsdl.exe, for generating code from XSDs or WSDL files.
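The invocations look roughly like this - treat it as a sketch, because the file names and namespace are made up and the switches are from memory (xsd.exe /? and wsdl.exe /? list the real options):

rem Generate C# classes from the agreed schema
xsd.exe ExampleMessages.xsd /classes /language:CS /namespace:Example.Messages

rem Or generate code from a WSDL file instead
wsdl.exe /language:CS /namespace:Example.Messages ExampleService.wsdl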

We went down the xsd.exe route, as we didn't have any tools handy for building the WSDL from scratch. We then built the Web Methods using the objects that had been generated by xsd.exe. This helped somewhat, but we still had a series of 'interoperability' problems getting our Interoperable Web Services to interoperate. I'll describe some of them to give an idea of how inoperable interoperability can be:

Problems With Blank Namespaces

We had a proper Namespace for the Web Methods (which was defined using a [WebService(Namespace = "something")] attribute on the class that held the Web Methods) but some of the classes generated by xsd.exe had XmlRootAttribute() attributes that specified a blank namespace, like this:
[XmlRootAttribute(Namespace="", IsNullable=false)]
It turned out that this led to the web service expecting a blank xmlns attribute in the incoming SOAP message, like this:
<soap:Body>
    <ExampleWebMethod xmlns="something">
      <ExampleRequestStructure xmlns="" >
            ...
      </ExampleRequestStructure>
    </ExampleWebMethod>
</soap:Body>

When Dot Net tried to call this web method, it had no problems (because it was following the WSDL exactly and so put the blank namespace in). But the third-party Perl guys were hand-coding their request code and were getting tripped up by the lack of an xmlns="" in the SOAP request they were sending. Eventually we figured it out and removed all the blank namespace definitions from the code, so that the XmlRootAttribute looked like this:
[XmlRootAttribute(IsNullable=false)]

We also had to change some of the XmlElementAttribute attributes from
[System.Xml.Serialization.XmlElementAttribute(Form=System.Xml.Schema.XmlSchemaForm.Unqualified)] 

to
[System.Xml.Serialization.XmlElementAttribute()]


(If we had used a targetNamespace in our XSDs, or built a WSDL file and generated code from that, we probably would have avoided this issue)
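For illustration, a request schema with a targetNamespace might look something like this (the namespace URI and element names are made up); xsd.exe would then generate classes qualified with that namespace rather than leaving them in the blank one:

<?xml version="1.0" encoding="utf-8"?>
<xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema"
           targetNamespace="http://example.com/something"
           xmlns="http://example.com/something"
           elementFormDefault="qualified">
  <xs:element name="ExampleRequestStructure">
    <xs:complexType>
      <xs:sequence>
        <xs:element name="SomeField" type="xs:string" />
      </xs:sequence>
    </xs:complexType>
  </xs:element>
</xs:schema>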

Problems with SOAPAction

In SOAP 1.1, there is an HTTP header called SOAPAction that is supposed to be sent along with the SOAP request XML. If you browse to a Dot Net asmx file in a browser and look at the example SOAP 1.1 request it shows you, the top of it looks like this:

POST [webservice url] HTTP/1.1
Host: [host]
Content-Type: text/xml; charset=utf-8
Content-Length: [length]
SOAPAction: "[namespace]/[method name]"

... then all the xml stuff ...

where the bits between the square brackets [ ] are filled in with the right values.

SOAPAction is a bit odd because it plays an important part, but it's not actually in the XML bit of the SOAP request. It's there so that the server can route the request to the right method without having to actually parse the SOAP to find the method name. But XML fans were a bit miffed about having an important part of their SOAP system not actually in the XML at all, so it was dropped from SOAP 1.2.
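For comparison, the SOAP 1.2 version of the same example request carries the action as a parameter of the Content-Type header instead, roughly like this (same square-bracket placeholders as above):

POST [webservice url] HTTP/1.1
Host: [host]
Content-Type: application/soap+xml; charset=utf-8; action="[namespace]/[method name]"
Content-Length: [length]

... then all the xml stuff ...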

The problem we had with SOAPAction is that although it is mentioned in the official WSDL 1.1 definition, no specific format is defined. And guess what?

  • Some platforms, such as CGI web services and Perl, use "<namespace>#<method name>" as the format, e.g. "something.com#test"

  • Dot Net uses the format "<namespace>/<method name>", e.g. "something.com/test"
The Perl guys thought we were doing it wrong, we weren't sure why our Web Methods were insisting on a slash instead of a hash, and we couldn't find a definitive statement of which format it was supposed to be, so it went back and forth for a while. It turns out the SOAPAction a Dot Net web service uses can be overridden by manually changing the WSDL, but in the end a little line of Perl code that got Perl to use the Dot Net format for SOAPAction solved the problem.
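If you'd rather fix it from the Dot Net side, one option is the SoapDocumentMethod attribute, which lets you set the SOAPAction a web method expects and advertises in its WSDL. We didn't go this way in the end, so treat this as an untested sketch (the names and URI below are just the made-up examples from above):

using System.Web.Services;
using System.Web.Services.Protocols;

[WebService(Namespace = "something.com")]
public class ExampleService : WebService
{
    // Explicitly set the SOAPAction to the hash-separated style the Perl client sends
    [WebMethod]
    [SoapDocumentMethod(Action = "something.com#test")]
    public string test()
    {
        return "ok";
    }
}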

If you're using Perl's SOAP::Lite library to call Dot Net web services, this page may help you:
Simplified SOAP Development with SOAP::Lite at PerfectXML

Problems with Geography

The Geography problem was that we had two different teams in different organisations trying to work together to sort out issues in the low-level HTTP and SOAP plumbing. In the end, most of the problems turned out to be pretty trivial. However, because of the time lag between us putting up a new version of our web services, the other team trying to call them using Perl, and then getting back to us with the results, what should have been trivial troubleshooting took days.

In the end, even though we don't know any Perl, installing it on our local network and running the test code that the Perl guys had provided proved to be pretty helpful. The benefit of being able to run the Perl code in-house whenever we wanted and seeing the results immediately actually offset the cost of not having any Perl skills. When we did need to change the Perl code, a bit of googling always led us in the right direction.

So there's probably a more generalised lesson there: When you're in a situation like this, even if you've never used the other platform, it's likely to be worth setting it up locally just so ideas can be checked and tested in one location instead of two locations having to work together.
