December 19, 2005

Good Threaded Code.

I've recently upgraded my main x86 development machine. After a lot of to-ing and fro-ing I decided on an AMD Athlon 64 X2 processor. It has become increasingly clear that the wave of the future is multi-processor / multi-core. Intel and AMD are both talking about 8-core processors before the end of the decade, Sun have just released their Niagara core, Microsoft/IBM are using a 3-core PowerPC CPU in the new Xbox and Sony/IBM have produced the Cell processor, which makes its multi-core debut in the PlayStation 3.

With the advent of consumer level multi-core processors in both home computers and consoles it is becoming clear that we are all going to write more in the way of threaded and concurrent code.

I for one am looking forward to having all that processor power, but as usual the question is how are we, as developers, going to make best use of it?

A part of the solution will be the increasing rise of APIs and frameworks such as Doug Lea's util.concurrent library (http://gee.cs.oswego.edu/dl/classes/EDU/oswego/cs/dl/util/concurrent/intro.html) for Java, much of which has been absorbed into Java 5's java.util.concurrent package. By providing well-designed and well-tested concurrency code, such libraries will simplify the programming of multi-threaded applications.
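To give a flavour of what these libraries buy you, here is a minimal sketch using Java 5's java.util.concurrent: work is farmed out to a fixed thread pool and the results gathered through Futures. The class and method names are my own, purely for illustration.

```java
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.Callable;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class PoolSketch {
    // Sum the squares of 0..n-1, farming each square out to a fixed pool.
    public static long sumOfSquares(int n, int threads) {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<Long>> results = new ArrayList<Future<Long>>();
            for (int i = 0; i < n; i++) {
                final long x = i;
                results.add(pool.submit(new Callable<Long>() {
                    public Long call() { return x * x; }
                }));
            }
            long total = 0;
            for (Future<Long> f : results) {
                total += f.get(); // blocks until that task has completed
            }
            return total;
        } catch (Exception e) {
            throw new RuntimeException(e);
        } finally {
            pool.shutdown();
        }
    }
}
```

The point is that all the thread creation, queueing and hand-over is done by well-tested library code; the application only supplies the tasks.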

Application Servers will take up part of the burden, but at the usual price of lowered performance and increased cost.

I suspect that certain diagrams, such as UML activity diagrams, will become worth their weight in gold.

Lesser-known concurrency techniques such as spin-locks will become more widely understood.

The major expense in using these multi-core systems will be the synchronization points where different threads / processes exchange data. The skill will be in minimising the impact of those synchronization points.
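A small sketch of what minimising synchronization points can look like in practice: each worker below sums into a thread-private variable, so the only synchronization is the join-and-combine at the end rather than a lock around every addition. The names are illustrative.

```java
public class PartitionedSum {
    // Each worker accumulates a private subtotal; the only synchronization
    // points are the thread joins at the end, not every addition.
    public static long parallelSum(final long[] data, final int threads) {
        final long[] partial = new long[threads];
        Thread[] workers = new Thread[threads];
        for (int t = 0; t < threads; t++) {
            final int id = t;
            workers[t] = new Thread(new Runnable() {
                public void run() {
                    long local = 0; // thread-private: no contention while summing
                    for (int i = id; i < data.length; i += threads) {
                        local += data[i];
                    }
                    partial[id] = local; // one write per worker
                }
            });
            workers[t].start();
        }
        long total = 0;
        for (int t = 0; t < threads; t++) {
            try {
                workers[t].join(); // the synchronization point: wait, then combine
            } catch (InterruptedException e) {
                throw new RuntimeException(e);
            }
            total += partial[t];
        }
        return total;
    }
}
```

Contrast this with incrementing a single shared, synchronized total: the same answer, but with a lock acquisition per element instead of one join per thread.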

I'd recommend that all developers / designers / architects intending to make any money over the next decade get their hands on this new generation of SMP machines as soon as possible and start understanding the complexities and opportunities.

October 20, 2005

Neil negotiating with Meeraj...



Poor Neil, another day in the Voca offices and another day trying to get Meeraj's approval for a code change.

August 20, 2005

The First Post-Industrial Technology?

I was sitting with Neil Ellis one lunchtime last week; we were discussing how and where mass-production techniques could be applied to software.

I explained to Neil why I believed that this couldn't be done.

I feel that the point of mass-production is to reduce the cost of producing copies of a prototype.

In car manufacturing, for example, even the simplest of prototypes for the cheapest of cars costs upwards of £100,000. Millions of pounds will be spent fitting out a production line. After applying mass-production techniques, the copies will be sold for a twentieth of the cost of the original.

With software we are in a very different space, the cost of mass-production is essentially zero. All the cost is in the prototype.

Building a car prototype involves designing components, testing them, putting them together and testing the whole. Engineers will often design a car prototype re-using components from other vehicles and will design components to be re-used. Does this sound familiar?

The advent of devices that can print components, and eventually of nanofactories, means that the cost of mass-production will shrink down to purely that of the raw materials and the energy.

This means that trends in software development may be pointers to a future built on these new devices. I wonder what an open-source washing machine will look like?

July 31, 2005

OptimalJ - My Review.

I've been holding off on this entry for some time. I wanted to wait until I left the place where I was using it so that I could feel able to be completely honest.

I'll be giving a user's-eye view from both the architect's and the developer's perspective: I'll talk about designing using it and developing with the artifacts it produces. I've been using OptimalJ 3.2, which is a relatively old version; newer versions will have addressed some of the problems that I'll be raising, and I'll mention any fixes that I am aware of.

Model Driven Architecture is yet another attempt to increase application complexity while reducing development difficulty, using pictures (usually UML) to define the components and code generation to produce them.

OJ contains a set of UML modelling tools, which are separate and distinct from the MDA tools. It did not seem to be possible to move the analysis and design done using OJ's own UML tools directly into the MDA tools. On the project that I was working on we used Rational Rose to do the analysis and design, before using OJ's MDA tools. The inbuilt UML tools were inferior to those in Rose.

Using OJ to produce the MDA models is little different from using the class modeller in a UML tool: defining classes, attributes and methods. Unfortunately, aside from the class modelling, most of the rest of the process is about walking through wizards or setting properties. A lot more thought could have gone into using more UML diagram types. For example, when one wants to define dependencies between services, one has to add to the 'UsedComponent' property values. These kinds of dependencies could easily be defined using collaboration or sequence diagrams.

The three major 'models' that OJ works with are the Domain Model (used to define the domain objects and services), Application Model (fleshing out the domain model and getting quite platform specific) and lastly the code model which is the generated code.

The following layers are defined by OJ by default in the Application Model:

  • DBMS - The physical data model.
  • Common - cross layer objects such as DTOs (in OJ speak UpdateObjects and DataObjects), OJ enumerations and structs.
  • BusinessLogic - if I need to explain this one to you, perhaps you shouldn't be reading this... Well maybe I should explain one thing; BusinessLogic lumps Entity and Session Beans together and does not attempt to guide one down more structured approaches such as using the Session Facade pattern.
  • BusinessFacade - a curious set of auto-generated facades that will try to use UserTransactions if you don't watch them very carefully. Mainly useful.
  • Presentation - auto-generated struts forms and actions that are only really useful for data entry and prodding the services. In more recent versions of OJ a workflow designer has been added which will hopefully make this a lot more useful.

Out of the box one can only really architect J2EE/EJB applications, persistence only uses entity beans.

In practical use, OJ gets increasingly sluggish as the application increases in size. For what I would call a medium-small application we needed to wait anything up to a minute while OJ digested simple property changes; essential tools that check and update the models, and that need to be run frequently, took 10 minutes or more. It is also incredibly memory hungry: architects' machines were running with 2 gigabytes of RAM.

These problems can be managed by splitting your application into independent subsystems.

The code generation produces code that is split up into free and guarded blocks. Developers are expected to place business and application logic into the free blocks. I personally believe that weaving hand-cranked and generated code together in this way is a bad idea:

  • It makes re-designing and re-factoring unnecessarily difficult as OJ is not properly joined up. If you change a class name or package, the developer code gets put into a 'recycle bin' and needs to be retrieved.
  • The quality of code generated by the default patterns is more than a little suspect and makes it very difficult to get useful data out of reporting tools such as findbugs and checkstyle.
  • It means that the model is not the thing. Not only do you have to check the model into your repository, you have to check in large swathes of generated code.

The code produced by the default patterns does not inspire confidence in the Java abilities of the people producing the application. For example, a form of dirty marker pattern is provided to support the UpdateObjects (DTOs) produced in the common layer. Unfortunately it is one of the worst implementations that I have ever come across. The authors seem to have so little understanding of encapsulation that they require a developer to call a method to set a changed flag for each field manipulated. When one looks at an UpdateObject's interface, one sees nearly the entire workings of the OJ dirty marker pattern marked public for all to see, and it is not a pretty sight. I was so outraged that I was very tempted to go and find the authors and slap them till they promised never to produce something like that ever again.

The version that I was using had one other significant failing: the model-merge functionality that allows multiple modellers to work simultaneously was broken, and this introduced a major bottleneck.

Out of the box OJ falls into negative ROI. Any modern project using open source code generation can out-produce it.

Reading over what I have written it would seem that I hate it with a passion. That would be untrue. The potential is huge especially when one realises that one can rewrite the meta-models and patterns.

I would only recommend OJ to an organisation that had significant up-front time to invest:

  • Take the time to really understand OJ's capabilities.
  • Improve the meta-model to provide a better breakdown of the business logic tier.
  • Throw out many of the patterns and introduce new ones that use callbacks or dependency injection to move developer code out of the generated code.
  • Produce patterns that make use of a wider range of technology.
  • Write a decent dirty marker pattern.
  • Wait for Compuware to make the generation steps fully scriptable in Ant.
  • Invest in monster computers for its architects.

Once all this had been done, then a positive ROI should materialise.

July 19, 2005

Another time to use dependency injection?

I was talking to some friends this lunchtime, explaining IOC containers and what dependency injection could mean for them.

A thought came to me. What if IOC containers could also inject dependencies when objects were deserialized? Objects could behave completely differently across tiers. Say an object has a dependency on a persistence interface. On the middle tier the object would get stored to the database, but on the client the object would get serialised to the middle tier for storage.
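A toy sketch of the idea: a static field stands in for the IOC container, and readResolve() re-injects the dependency from whatever the local tier has registered, so the same object saves differently on each tier. Every name here is made up for illustration; this is not any real container's API.

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.ObjectInputStream;
import java.io.ObjectOutputStream;
import java.io.Serializable;

public class InjectOnDeserialize {
    public interface Persister { String store(String data); }

    // Stand-in for the IOC container: each tier registers its own Persister.
    public static Persister tierPersister;

    public static class Order implements Serializable {
        public final String data;
        private transient Persister persister = tierPersister; // never serialized

        public Order(String data) { this.data = data; }

        // Run during deserialization: re-inject from the *local* tier's
        // container, so behaviour follows the object across tiers.
        private Object readResolve() {
            persister = tierPersister;
            return this;
        }

        public String save() { return persister.store(data); }
    }

    public static byte[] toBytes(Object o) {
        try {
            ByteArrayOutputStream bos = new ByteArrayOutputStream();
            ObjectOutputStream oos = new ObjectOutputStream(bos);
            oos.writeObject(o);
            oos.close();
            return bos.toByteArray();
        } catch (Exception e) { throw new RuntimeException(e); }
    }

    public static Object fromBytes(byte[] b) {
        try {
            return new ObjectInputStream(new ByteArrayInputStream(b)).readObject();
        } catch (Exception e) { throw new RuntimeException(e); }
    }
}
```

On the middle tier the registered Persister would write to the database; on the client it would ship the object over the wire. The object itself is none the wiser.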

July 09, 2005

Childish Men? Or Childish Women?

I'm sure that the women in my life will give me a lot of grief for this entry, but here goes...

I'm a contentedly childish man, I like my toys, I like cartoons; generally I like having the time to play from time to time. When I have to be I can be as adult as necessary, dealing with difficult things in personal or professional life.

What I have noticed over the years is a certain undercurrent in our society where certain of the feminine elements belittle men because of the very form of childishness that I acknowledge in myself. Expressions such as 'He's such a big kid.' or 'Boys and their toys...' are a mild form of this.

I have started to wonder however whether women are any more grown up than men. I think that they are just better at public relations.

When a boy is playing with toy cars and guns he is being childish; when a girl is playing with dolls, makeup and dressing up she is preparing for adulthood.

This transfers forward to later life: when a woman spends hours trying on different outfits or being pampered at a beautician's, that is adult. I wonder whether they are really just reverting to childhood.

June 30, 2005

Oracle OCI JDBC Driver problems

Yesterday, I spent a couple of hours trying to work out why we were getting an UnsatisfiedLinkError on the OCI8 libraries when we were running the 9.2 JDBC OCI drivers against an Oracle 9.2 database on Windows Server 2003.

After a lot of head scratching it turned out that the application's ojdbc14.jar and the OCI libraries were mutually incompatible despite appearing to be for the same point release of Oracle.

The application's driver jar was the one that was checked into CVS, whereas the libraries came as part of a new install of Oracle. When we used the driver jar from the new install, our problems went away.

It is definitely worth remembering to always use the driver jars that came with the libraries that you are using.

June 23, 2005

Fast Web Applications - Design thoughts

Whenever I design web applications there are a number of steps that I go through in order to achieve decent performance. Most of these you will all know about, however there is one thing that I do that does not seem to be well recorded in any book or online article.


Most of you will already have thought about static and dynamic content, in a java web application that will probably mean that there will be a set of HTML pages and images on the apache server and JSPs or Servlets on a Servlet engine.


I have a third classification of content that is very useful to think about: Semi-static.


Semi-static content is content that is data-driven (like dynamic content), but where the underlying data changes relatively infrequently compared to the number of times that the content is viewed. Good examples would be a daily graph of stocks or an airport flight information page that is displayed across hundreds of displays.


There are various ways of dealing with this type of content, but they all come down to one thing - caching. Many people when they come across this type of content end up creating a jack-of-all-trades servlet that manages the caching and invalidation of this content. I tend to favour a different approach. I like to generate this content using a template engine (or equivalent for images) to the file system behind my web-server and let the web-server and the downstream caches do their work. The template engine should be event driven, I would normally use a lightweight JMS system to do this.


With a little careful configuration you can leverage the strengths of HTTP to allow an HTTP HEAD request to be issued, checking whether the browser cache, proxy cache or web-server in-memory cache has the up to date version and serve that directly from the cache or retrieve the new version from the web server file system accordingly.
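The conditional check at the heart of this can be boiled down to a tiny decision function. This is a sketch of the logic a server applies to an If-Modified-Since conditional request, not any particular web server's API.

```java
public class ConditionalGet {
    // HTTP dates have one-second resolution, so compare at that granularity.
    // ifModifiedSinceMillis < 0 means the request carried no cache validator.
    public static int statusFor(long resourceLastModifiedMillis,
                                long ifModifiedSinceMillis) {
        if (ifModifiedSinceMillis >= 0
                && resourceLastModifiedMillis / 1000 <= ifModifiedSinceMillis / 1000) {
            return 304; // Not Modified: the cached copy is still good
        }
        return 200; // send the freshly generated file from the file system
    }
}
```

Because the semi-static content sits as plain files behind the web server, the server's own file-modification times drive this logic for free; no jack-of-all-trades caching servlet is needed.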

May 23, 2005

A True Hybrid

On the project that I am on at the moment, we are following an increasingly familiar pattern. A primarily relational database with one or two XML fields. The XML fields store either data that requires an extremely flexible structure or data that is very object-oriented.

I was wondering whether this signals the need for a new kind of database. Relational databases answer many challenges, but not always perfectly; object databases are often less performant but meet data requirements well. Maybe it is time to bring the two together and create a true ORDBMS, not the type that Oracle tried, but one where a relational database has object fields that are in effect mini but true ODBMS instances.

March 29, 2005

Shiny New Toy on the Horizon...

Finally it looks like we might be getting something that I have always dreamed of.


http://java.sun.com/developer/technicalArticles/Programming/mvm/


I (among many others) have long wanted a way to reliably use a JVM as a true Virtual Machine. To be able to run multiple applications with only the overhead of starting one runtime. It should speed up Swing applications and make ridiculous java startup times a thing of the past.

It has always been feasible to go some way to achieve this kind of thing using a custom java application. This application would take several parameters at the command line:

  • The classpath for the application class to run.
  • The name of the application class to run.
  • Any parameters to be passed to the application class.
When the custom application class was run it would try to connect to a local socket, trying to find a running instance of itself. If it found a running instance it would communicate the parameters for the application to be run over the socket and terminate. If it didn't find a running instance it would instantiate a classloader with the relevant classpath, reflectively instantiate the class from the new classloader and run its main method, passing in the specified parameters. It would then continue running, listening on the socket for any more applications to run.
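The classloader-and-reflection core of such a launcher is only a few lines. A sketch (the socket handling is left out, and the Demo class is a stand-in application added purely so the sketch can be exercised):

```java
import java.lang.reflect.Method;
import java.net.URL;
import java.net.URLClassLoader;

public class AppRunner {
    // Load the application's main class through its own classloader and
    // invoke main(String[]) reflectively, as the launcher described above would.
    public static void run(URL[] classpath, String mainClassName, String[] args) {
        try {
            ClassLoader loader = new URLClassLoader(classpath,
                    ClassLoader.getSystemClassLoader());
            Class<?> mainClass = loader.loadClass(mainClassName);
            Method main = mainClass.getMethod("main", String[].class);
            // static method, so the receiver is null; the cast stops the
            // String[] being spread as varargs
            main.invoke(null, (Object) args);
        } catch (Exception e) {
            throw new RuntimeException("could not launch " + mainClassName, e);
        }
    }

    // A toy application class so the sketch can be exercised.
    public static class Demo {
        public static String lastArg;
        public static void main(String[] args) { lastArg = args[0]; }
    }
}
```

A real launcher would, of course, pass a non-empty classpath and run each application on its own thread.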


While this approach does a very effective job of reducing the footprint of running applications, it does not address the startup time (a new JVM instance is still started for every launch, even if it terminates quickly), nor does it properly isolate the applications (locales, system properties and System.exit() all combine to throw a spanner into the works).


I'm looking forward to playing with the research release of the MVM when it becomes available.

March 16, 2005

Evolution and Religion

Sorry about the length of time between posts, but I've spent an awful lot of time thinking about this post.

Recently I've been aware that several of my friends and acquaintances fall into one of two camps. One camp is the believers, who feel that the theory of evolution is an argument used by others to attack the notion that God exists. The other camp is the sceptics, who believe that the theory of evolution can be used to deflate certain of the 'absurd' beliefs endemic to most religions.

I happen to fall into another and much quieter camp, those who believe both in a religion and in evolution (whether Darwinian or a more modern form).

So, you ask, how do I reconcile the two?

My first thought is that God tends to use the most effective and elegant means of addressing his will. To my mind that means that there is no reason why evolution isn't the technique that God chose to use in creating the many and varied creatures on this planet.

Secondly I tend to take very seriously the assertion that God stands somehow outside of our conception of time. This means that to God it makes no difference whether it took 4 billion years of our time for us to evolve or we appeared in an instant. What matters is the result.

Something like evolution is almost required, because one thing that most religions are clear about is the importance of faith. Without evolution or its like, we would be able to point and say we know that a supreme being created us. Where would faith be then?

I also believe that there are reasons why we were created with intelligence; one of them is that we can start to approach an understanding of God (however slight an understanding), by investigating the marvellous universe around us.

March 01, 2005

J2EE Misconceptions - Part 2

Another popular misconception is around SQL and JDBC. I find that an awful lot of database-specific SQL gets written, and a lot of it for no good reason.

I would suggest that you go and read the 'Scalar Functions' section in the JDBC specification and then review all your SQL. I think that you will find a lot of the database specific SQL could be rolled up into generic SQL if you used these functions.

JDBC 'Scalar Functions' are by no means a panacea, but with judicious use you can seriously reduce the amount of DB specific SQL you are using which in turn will reduce your maintenance / migration burdens.
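As an example, the JDBC escape syntax lets you write {fn ...} once and have each driver translate it into the database's native function, such as Oracle's SUBSTR or SQL Server's SUBSTRING. A toy sketch (the customer table and name column are made up for illustration):

```java
public class PortableSql {
    // JDBC escape syntax: the driver rewrites {fn ...} into the
    // database's native function call, keeping the SQL generic.
    public static String firstThreeChars(String table, String column) {
        return "SELECT {fn SUBSTRING(" + column + ", 1, 3)} FROM " + table;
    }
}
```

Passed through PreparedStatement, the same string works unchanged against any driver that supports the escape, which is exactly the maintenance saving described above.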

February 28, 2005

New found liking for dessert wines.

I have just had a bit of an education...

I've just spent a weekend on the Isle of Wight with my sister; Saturday afternoon was spent sharing a significant part of my whisky collection with some old friends. Chris (who happens to be a very talented musician) returned my collection of CDs that I accumulated on my '03 roadtrip and told me how much he had enjoyed 'Los Hombres Calientes'. I'd forgotten that those CDs were in that particular part of my collection, but was very pleased to know that they had given him so much pleasure.

After a couple of hours drinking whisky, we drifted across into discussing wines. I stated that I had never come across a dessert wine that I liked, with the honourable exception of the occasional Italian Vinsanto. I declared that they were all too sweet and sickly.

Rather well lubricated by this stage, Chris returned with a wax-sealed bottle and introduced me to 'Royal Tokay'. I had heard of 'Tokay' before, but today was the day when I discovered that it was a tiny wine producer in Hungary, with a half bottle of the 'Royal Tokay' costing £30. I'm not a wine writer so I won't even try to describe what it tasted like; but after cleansing my palate with a little water, I enjoyed one of the finest flavours that it has been my pleasure to experience.

I am now on the look out for Tokay, and I want to try the 'Imperial'.

February 20, 2005

The Speed of Light?

The night before last I must have been having some corking dreams, because I woke up this morning with a question that I would love an answer to.

'How does the quantum foam affect the speed of light?'

For my readers who are not physicists, let me break this question down.

The 'quantum foam' is a descriptive term for the fact that there really is no such thing as a vacuum. Due to the probabilistic nature of quantum mechanics even in a space where all matter as we recognise it has been removed, particles continually pop in and out of existence. Energy is conserved because these particles disappear as rapidly as they appear.

This being the case, I wondered whether the speed of light is in some way determined by its interaction with this quantum foam.

A side query is whether the increase in mass associated with objects moving relatively faster (most obvious as the speed of light is approached) is in some way connected with interactions with this foam. As masses move faster, relatively speaking, they sweep out more and more space, vastly increasing their chance of interacting with these virtual particles.

February 12, 2005

Singleton - Pattern or Anti-Pattern?

It's interesting to see how various developers respond when you ask them about the singleton pattern. Most react positively, others say that it is an anti-pattern of the worst form.

Why it is considered an anti-pattern by some?
  1. In Java it doesn't always meet the contract of the 'Singleton Pattern' as defined in the GoF book. Classloaders mean that more than one instance of a Singleton can exist in the same JVM.
  2. Developers often use it as a dumping ground for de facto global variables, harking back to the most unstructured early days of programming.
  3. Implementations often feature 'Singleton' in the names of implementing classes and other code becomes dependent on the fact that it is an instance of the singleton pattern.

I believe however that there is nothing wrong with the singleton pattern, just in implementations of it.

One should think of the singleton pattern as an implementation pattern for the factory and pool patterns. Developers should be handed a factory or pool and told only that it will give them an instance of a particular class or interface on request. They shouldn't know that it is the same object instance each time.

This allows the developer of the pool or factory to make efficient use of available resources, if possible, but also allows the implementation to be changed if necessity dictates. By hiding the use of the singleton behind the pool or factory patterns, code fragility is avoided.
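A minimal sketch of the idea (all names are illustrative): the singleton is a private implementation detail of the factory, and callers only ever see the interface.

```java
public class ServiceFactory {
    public interface Service { String name(); }

    // The singleton lives here as a private implementation detail.
    private static final Service SHARED = new Service() {
        public String name() { return "shared-service"; }
    };

    // Callers only know they get *a* Service; they are never told it is
    // the same instance every time, so pooling or per-call creation can
    // be swapped in later without breaking any client code.
    public static Service getService() { return SHARED; }
}
```

If resource pressure ever demanded a pool of instances, or testing demanded fresh ones, only the body of getService() would change.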

February 11, 2005

J2EE misconceptions - Part 1 (of quite a few actually).

On every project I never fail to be surprised at how little supposedly experienced J2EE developers actually know. This week I found myself asking team members whether they had actually read the EJB specification, and the answer came back a resounding 'No!'.

It seems that all too often when people are learning J2EE and EJBs in particular, they go and look at the specifications with a beginner's eye and form the impression that the specification documents are dense and impenetrable. They never go back with a more experienced eye and have another go. When I went back and re-read the specifications half way through my first J2EE project I suddenly realised that they were really useful and were packed full of vital information that allowed me to treat J2EE more as a science than as the black art practiced by so many.

The misconception that I want to address with this post is around Container Managed Transactions (CMTs). Let me ask you a question and please be sure what you think your answer is before you move on.

In what circumstances do CMTs roll back?

Take a second or two.

The answer that I usually get is: 'When an exception is thrown.'

Trouble is, the answer is sort of wrong.

The correct answer is: 'When an unchecked exception is thrown out of the transaction context or when EJBContext.setRollbackOnly() is called.'

What many of the developers I talk to fail to realise is that checked exceptions do not roll the transaction back.

So did you get the right answer?
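The container's decision can be boiled down to a rule like this. To be clear, this is my sketch of the rule the specification describes, not real container code:

```java
public class CmtRule {
    // Sketch of the container's choice: unchecked exceptions
    // (RuntimeException / Error) escaping the transaction context roll it
    // back; checked application exceptions do not, unless the bean has
    // already called setRollbackOnly().
    public static boolean rollsBack(Throwable thrown, boolean rollbackOnlyCalled) {
        if (rollbackOnlyCalled) return true;
        return thrown instanceof RuntimeException || thrown instanceof Error;
    }
}
```

So a bean that throws a checked application exception and expects its writes to vanish is in for a surprise: it must call setRollbackOnly() explicitly.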

February 03, 2005

UML Abuse

Had a mixed day at work today, good and bad.

The bad today was around a discussion of how we were going to use UML to analyse use cases. All was going well until we got on to discussing how to model the user interface. We need to record the navigation between screens, the decision made was to do this with a mixture of class and statechart diagrams. I argued against this and suggested Visio, but was overridden on the basis that we wanted to do it all in UML via Rational Rose.

Thinking back I realise that I have actually seen this a lot: people determinedly using UML when another style of diagram would have been more appropriate. I seem to recall that Booch, Jacobson and Rumbaugh, when they first wrote up UML, were quite clear that while the diagrams covered the majority of design cases, they by no means covered them all. Where has this attitude gone?

Does anyone else see UML abuse as a problem?

February 02, 2005

Name Dropping

I had an interesting day yesterday, some guys from JBoss came over to sell a support package to the project that I am working on. So I found myself, just before lunch, shut in a meeting room hearing the pitch made by a commercial venture on the back of open source software.

I can certainly see that there is money to be made by their business model. In the commercial and public sectors there is the concept of 'due diligence', hereafter known as 'arse covering'. The support packages on offer provide exactly the right feeling of security for management and customers, in this case backed up by genuine knowledge of the product.

Certain of the more rabid open source fans I have met believe that this is a dilution of their long term goal. At present I have to disagree, because of this commercialisation the rate of development in JBoss is extremely high and that benefits anyone who needs an application server and can't afford the cost models imposed by companies like BEA. I don't know whether a successful business can be established long-term by paid for arse covering, but we will see.

After the meeting I adjourned to the cafeteria to share a coffee with the gentlemen from JBoss and had an interesting time talking to one of the more vocal EJB3 supporters that I have come across. He was very clear that the current 'issues' with JDO are all the fault of the JDO specification team and how they 'snuck out' the candidate specification over Christmas. I certainly agree that it was not a time best calculated to make the right impression. I have a suspicion that the actual cause of the release date was the feeling 'Right let's get it finished and have a good Christmas.'

I have to admire the detachment displayed in discussing how the current situation was in part due to the frictions between a specification driven by large business and a specification driven by a thriving user community. I found myself agreeing that there was room for both specifications. Having more than one persistence specification allows me to exercise one of my dictums: 'The right tool for the right job.'.

My radical proposal to resolve the current problem is to go ahead and put the mechanisms in place for migration from JDO to EJB3, but also to state that mechanisms for the reverse migration should be part of the EJB3 specification.

January 28, 2005

Motorway Driving.

With my current contract, I'm having to do a lot of motorway miles to and from work. I tend to work odd hours and so tend to avoid the worst of the rush hour. However, there are a few things that I need to get off my chest...

I hate:
  • Under-takers - Not the sombre men that come to take away granny, but the fast furious drivers that come up inside lanes past a queue of cars. When they get to the slower moving obstacle in their lane that has caused the stack of cars in the outer lanes, they force their way back out causing everyone else to brake.
  • Underpowered White Van Men - White Van man is an acknowledged hazard of modern driving, but in his way he is predictable and possible to get along with. The only one I have any problems with are the ones with heavy loads and small engines. They zoom downhill in the outside lane at 80 mph. I have no problems with them zooming, it's when they go uphill and they use that brief spell of 80mph to justify remaining in the outside lane as they crawl uphill at 60 mph or below.
  • Sticky Drivers - I define a sticky driver as someone who stays in a lane regardless of whether the inner lane is empty. They are the cause of a lot of dangerous driving, either because a marginally faster driver is forced to overtake them, or because a significantly faster driver is forced to undertake them, or (worst of all) an impatient driver trying to persuade them to move drives right up their backside.

There are many minor irritations on the roads; roadworks, speed cameras and potholes; but it seems to be a constant that other road users cause the most irritation.


January 20, 2005

First impression of OptimalJ

Well, I've been using OptimalJ 3.2 for 4 days now and I can say that my first impression is 'stability', or more to the point, the lack of it.

Part of the problem is that it is based on NetBeans 3.5, which was never noted for its fabulous robustness, but a large number of the instabilities that I have encountered are in the OptimalJ functionality itself. This is a tool that promises great productivity enhancements, but to be honest they have all been soaked up by the time spent trying to work around the issues.

This instability ties in with an idea that I've been kicking around for some little time now: the limits of complexity that can be achieved in a piece of software. The improvements in development (assembler, functional programming, procedural programming, object orientation, UML, etc.) have all increased the complexity of software that can be produced.

I get the feeling with OptimalJ that this particular software is pushing the complexity limits of the development processes used to produce it.

January 15, 2005

PVRs, what I really want in one now.

Been chatting with my friend Dan about my upcoming purchase of a Hauppauge DEC 3000-S USB satellite decoder. We were uncertain about the lack of a hardware encoder for recording until we realised that Digital Video Broadcasts (DVB) are encoded as MPEG-2 in the first place.

I got to thinking about what it is that I want in a PVR beyond what is currently available and the major one is being able to view one channel while another is being recorded. This is something that is available with the current analogue technology VCR and television as they each have their own hardware to decode the signal and either display or record it.

Ideally I would like a modular PVR being able to plug in additional receiving/decoding units, so that I can view/record across as many channels as I have decoding units.

Of course there are practical design problems, it would be possible to overwhelm both the CPU and the filing system with the amount of data being written. Within those practical limitations I would love to be able to record 3 or 4 channels at a time.

I suppose the next thing I need to do is look at what can be cobbled together now with available technologies. The easiest way to prove the concept will be to use two USB-compatible decoders. Next will be sorting out drivers and software capable of dealing with two separate streams. I think I'll probably have to start with MythTV and carry on from there. I know that the Hauppauge software won't cope.

If anyone reading this has ideas, I would appreciate your feedback.

January 12, 2005

Scrambled Eggs - My Way.

Well, this is going to be my first cooking post. I've been cooking since I was 7, when my mother first came down with M.E.; it was only baked beans, but I have come a little way since then.

It has taken me quite some time to perfect my scrambled eggs recipe. The quest began when I realised that the white of an egg and the yolk need quite different cooking regimens to achieve perfection: undercooked whites are slippery and unpleasant, while overcooked yolks are dry as dust and taste vaguely sulphurous.

So here is how I cook my scrambled eggs now:

Ingredients:
  • Eggs, 1 or 2 per person depending on famishment and size of egg.
  • Milk, about 50 ml per egg. I prefer organic full fat for this but it does work with semi-skimmed.
  • Vegetable oil (I prefer sunflower oil).
  • Seasoning.

Put enough vegetable oil into a small saucepan to cover the base and swirl around the sides. Separate the egg whites from the yolks, putting the whites in the saucepan and the yolks in a bowl with the milk.

Place the saucepan containing the whites and vegetable oil on a low heat. The intent is to cook the whites without frying them to crunchiness. While the whites are cooking whisk the yolks and the milk together.

When the whites have solidified, you can take the opportunity to pour away any excess oil from the saucepan. A little oil gives sheen and prevents sticking, too much oil is just plain greasy.

I now tend to use the edge of a wooden spoon and break the cooked whites down into pieces. The size tends to reflect how 'refined' I'm feeling. Now add the yolk / milk mixture to the pan and return to the heat.

How high I now have the temperature tends to reflect how much attention I am prepared to pay to it. If I'm in a hurry and am prepared to stir continuously I will up the temperature; if I want to pay attention to cooking other things, I will leave it on the lowest possible temperature and stir occasionally.

Before long the scrambled eggs will start to go 'blop' and steam slightly; this is the signal that you have to pay attention and stir to stop them sticking to the pan. Hereafter it is up to you how thick and well set you want your scrambled eggs: the longer you cook them, the harder they will get. Don't forget that they will continue to thicken slightly after you take them off the heat.

Serve the scrambled eggs and only now apply seasoning. If you add salt any time before this, you risk the eggs curdling and separating, giving you a texture of grit floating in tasteless water.

I personally like my eggs served either on wholemeal pitta bread or a nice malted wholegrain toast. Cooking up a few bacon lardons and sprinkling them on top can look really smart too.

The final thing to do is enjoy eating them!

January 10, 2005

A chance to experience MDA.

'Model Driven Architecture'.

There has certainly been a lot of discussion over the past couple of years about MDA. I have sat on the sidelines, an interested observer, wondering quite how well the whole thing will work.

Unlike most development approaches, MDA makes the model the primary artifact, not the source code. This leads to the following concerns:
  • The versioning of the model will be interesting, depending on how it is stored and encoded. Remembering the nightmares with Rational Rose models, I'm not sanguine.
  • XMI and UML are not of themselves programming languages and so are not able to encode business logic. This means that a significant secondary artifact is the business logic code. The interaction between the model and the business logic will be interesting.
  • Tuning and customisation. The templates for conversion from the Platform Independent Model (PIM) to the Platform Specific Model (PSM) may be adjustable, but how far can you go? Can you apply specific tunings/customisations to specific components?
  • The tools to accomplish this seem likely to be complex. What is the scope for vendor lock-in?

I've tinkered with AndroMDA and had mixed results. However, I will be working with OptimalJ now, so as I gain experience I will try to record what I learn here.


January 07, 2005

Connection Pools - are they always a good thing?

Sorry about the hiatus, it was mainly because I had nothing that I wanted to say... I suspect that that is going to change.

I had a talk with a friend (hi Meeraj!) yesterday and we were discussing a connection pool that he was using. After certain queries, anything up to 100 megabytes was consumed and not garbage collected. The theory we arrived at was that PreparedStatement caching was the problem. I don't know if that was the case — I hope Meeraj will let me know — but it set me to thinking.

Why did we introduce the complexity of a connection pool to our code in the first place?

I don't just mean the complexity of writing them (that's mainly dealt with by DB vendors' own implementations nowadays); tuning them and ensuring the robustness of our own code around them can be a headache.

As I recall, the major reason for pooling connections was the relative expense of creating the connection in the first place. That expense is only large compared to the time a small query takes to run, so it made a lot of sense to reuse a single connection across several of these little queries. The longer-running the query, the smaller the proportion of its time spent creating the connection.
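To put rough numbers on that reasoning — the 50 ms creation cost below is purely illustrative, real figures depend on the driver, the network and the database:

```java
public class PoolingTradeOff {
    // Hypothetical connection-creation cost in milliseconds; treat this
    // purely as an illustration of the proportions involved.
    static final double CREATE_COST_MS = 50.0;

    // Fraction of a query's total wall time spent on connection creation
    // when no pool is used and a fresh connection is opened each time.
    static double creationOverhead(double queryMs) {
        return CREATE_COST_MS / (CREATE_COST_MS + queryMs);
    }

    public static void main(String[] args) {
        // A 5 ms query spends roughly 91% of its time on connection setup,
        // so pooling pays off handsomely.
        System.out.printf("short query: %.0f%%%n", 100 * creationOverhead(5));
        // A 60-second report spends well under 0.1% — the pool buys almost
        // nothing, while its caches may cost real memory.
        System.out.printf("long query: %.2f%%%n", 100 * creationOverhead(60000));
    }
}
```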

So, for a system in which queries are long running and connections tend to consume a lot of resources, does it make sense to have a connection pool? Would some simpler form of connection management be better?
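One such simpler form might be no pool at all: open a connection per unit of work and close it as soon as the work is done, so any per-connection state (a PreparedStatement cache, for instance) is released immediately. A rough sketch — the Factory and Work interfaces here are hypothetical stand-ins for a real DataSource and a JDBC callback; the open/use/close discipline is the only point:

```java
public class PerQueryManager<C> {
    // Stand-in for a real connection factory such as a DataSource.
    public interface Factory<C> { C open(); void close(C conn); }
    // Stand-in for the work done against one connection.
    public interface Work<C, R> { R run(C conn); }

    private final Factory<C> factory;

    public PerQueryManager(Factory<C> factory) {
        this.factory = factory;
    }

    // Each call gets its own freshly created connection, which is always
    // closed afterwards — even if the work throws — so nothing (statement
    // caches included) can accumulate between queries.
    public <R> R withConnection(Work<C, R> work) {
        C conn = factory.open();
        try {
            return work.run(conn);
        } finally {
            factory.close(conn);
        }
    }
}
```

For short, frequent queries this trades back the creation cost a pool would amortise; for long-running, resource-hungry queries it keeps connection lifetime — and memory — tightly bounded.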

The 'meta' lesson is that even for the most frequently used patterns and components, we should always remember why they were adopted in the first place and recognise the situations where they add no value, or perhaps even cause problems.