Fisi's overtake move on Coulthard [youtube.com]
If he can do this, why did he wait for the shame of being lapped by his teammate to pull it off?
It's a little reminiscent of some of Sato's old moves, but Fisi actually has the skill to pull it off. Of course if you're going to try a move like this, do it on someone as skilled as Coulthard (who you know will see you) and when it's not for position (so he would rather save his car than fight for the position). I'm not sure that it would have stuck against Schumi...
Software Framework Design Redux
This is going to be a little bit of a stream of consciousness, but I will probably build on it later.
First, two preconditions
- understand the business needs. kinda obvious, but you need to plan for it to be iterative, especially if the business needs are not fully known yet. (even if you think they are, they probably aren't)
- understand the mindset and capability of your developers—what are their strengths/weaknesses, what conceptual models do they click with or struggle with
The interesting thing I realise about these points is that they define your two main points of interest, which are also your two "clients": the business and the developers. Even if you are in exactly the same team as the developers who will be using your frameworks, they are really your clients in the sense that your job is to enable them to be productive.
Goals when designing & implementing a code framework
- (A) end result should allow the natural expression of the real world model such that it's easier to accidentally do the right thing than to make a mistake. this goal is symbiotic with the general goal of hiding complexity where possible.
- (B) build in enough flexibility for accidental reuse without increasing the complexity of the mental model required to put the objects to use. (more on accidental reuse to come in a later blog entry).
- (C) design for free improvement of end functionality via upgrades to the framework.
- (D) extend goal (A) such that simply using the framework should result in expressive self documenting code. an aid to achieving this is to think of objects/methods in terms of their use rather than their implementation.
- (E) internally design the classes/methods to allow for a change in the implementation of their own properties. eg. an identifier changing from an integer to an alphanum or a property being replaced by a method. these changes should be possible without resorting to an automated refactoring tool (although sed, awk and perl are perfectly respectable refactoring tools ;)
Observation: (D) & (E) are much easier to achieve with an OO agile language such as Ruby or even OO Perl than with traditional compiled languages. Objective-C and the core NS/Cocoa classes do a good job at bridging this gap.
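As a minimal sketch of goal (E), here's what the "property can become a method" idea looks like in JavaScript, using a getter (the Order class, its fields and the example values are all invented for illustration, not taken from any real framework):

```javascript
// Sketch of goal (E): uniform access via a JavaScript getter.
// Version 1 of this class stored `total` as a plain property set in the
// constructor. Below, `total` has become a computed value, yet every
// caller that reads `order.total` is completely unchanged.
class Order {
  constructor(items) {
    this.items = items; // e.g. [{ price, qty }, ...]
  }
  get total() {
    // implementation changed from a stored number to a computation;
    // no refactoring tool (or sed/awk/perl) needed at the call sites
    return this.items.reduce((sum, i) => sum + i.price * i.qty, 0);
  }
}

const order = new Order([{ price: 10, qty: 2 }, { price: 5, qty: 1 }]);
console.log(order.total); // 25
```

Ruby gets the same effect for free because attribute access is always a method call; in languages without getters you'd have to plan for this up front with accessor methods.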
Update: A useful article on API design has just been posted on perlmonks: On Interfaces and APIs. While not as concise as it could be, the referenced material is very good. Two pieces that I would heartily recommend you read are:
The new world order of finance
"Banks feel like cathedrals, I guess casinos took their place"

Interesting thought. 10 years later it's almost a credible concept with the extremely fungible nature of online poker chips.
PS: Also noticed that there is no affiliates program for iTunes in Australia whereas there is for the US and UK - what's with that? I can link to iTunes for free, or I can give you a commission-paying Amazon link to the album...
The Cool and the unCool
I don't normally do a link roundup (that's what my del.icio.us feed is for), but there are a few links I've been storing up to discuss that deserve a few words.
- IBM z9 mid-size mainframe (links: IBM, El Reg)
Some people are desperate for cost effective scaling + reliability, but getting all three of these (cost, scale and reliability) with some (small a) applications is actually an incredibly difficult problem. Exhibit A is the brain power that it takes to make a serious search engine scale. The rule of thumb is that you can pick any two. If you're less cost sensitive than the average bear and you have very serious scale and reliability requirements then you could do much worse than look at mainframe hardware. Especially now that virtualised linux environments on mainframes are so de rigueur. For people used to the constraints of Intel hardware, the difference is really outstanding - imagine being able to suffer a CPU failure then physically install a new one - all while your servers keep running. I wasn't aware that IBM offered a co-processing module targeted specifically at J2EE - that's pretty cool.
- Reverse-capable C/C++ debugger for Linux (Links: Undo Software, LinuxPR)
Speaks for itself. What's really great is that they chose to use gdb as the front end. Not only does that reduce the barrier to learning how to use it, but all the traditional front ends will likely work with little change. Now that Macs are on Intel, it should also take very little effort for Undo Software to make a MacOS X version that would work seamlessly with XCode (which uses gdb already).
- Google Maps Geography Quiz
Test your knowledge of the globe with this très cool mashup
- Tofu : Multi-column web browser for MacOS X
Newspapers use multiple narrow columns for a reason. Now you can enjoy your electronic reading experience in the same form-factor with this free (alpha) viewer. I've tested it on PDF and html files, I assume it just subclasses an NS class of some sort.
And now for something completely different

From the recent excellent PCWorld article The 25 Worst Tech Products of All Time:
- Microsoft Bob
This links to a separate article about Bob including screenshots and discussion of the performance on modern hardware! I think Microsoft had drunk too much of the Sculley Kool-Aid!!
- Apple Bandai Pippin
I saw a mad demo of some prototype set top box software from Apple which seemed like a good idea that went nowhere. The Pippin on the other hand - that never seemed like a good idea to me!
Lunchtime Theatre in Melbourne
"$10 gets you some bread, soup and a bit of theatre. That's gotta be good!"
This year it seems to be in the Victorian Horticultural Society Hall with a piece called Treading Water.
Forcing Mail.app to resynchronise an IMAP folder
The cause was me force quitting Mail.app while it was in the middle of trying to make sense of the multiple huge imap copy/move requests I had given it.
The symptom was that when I clicked on an email title in that folder, a totally different message would open up.
Since all the other folders were fine, and each folder in Mail.app is represented on disk as an mbox file, I figured that just that mbox file (and associated index) was corrupted. I didn't want to recreate the whole account in Mail.app, since resynching my 3GB email account would use a lot of bandwidth (even though it is going over a compressed ssh tunnel).
A bit of poking around the ~/Library/Mail directory shows that each account has a directory named something like "imapUsername@imapHost", which may or may not be the same as your email. Inside that directory, there is a simple folder structure. An imap folder in your account named "FolderName" has a directory called "FolderName.imapmbox". If the imap folder also contains sub folders, there is an additional directory just named "FolderName" that follows the same directory structure. The "FolderName.imapmbox" directory contains an mbox file and a small plist file.
What I hoped would work, and in fact did, is that you can remove any given .imapmbox file (while Mail.app is NOT running) and it will be recreated and synchronised the next time you open Mail.app and visit that folder.
Sweet. It's just a pity that Apple doesn't provide a context menu item called "force re-sync" or something like that. Of course fixing the sources of crashes that cause corruption would be even better, but you'll never get rid of all bugs, so a re-sync option would be useful for the advanced punter who doesn't want to go typing rm -fr too often (did you get that Bruce?).
keywords for my American searching friends: resynchronize synchronized
Test Driven Development with Ruby [wiki.marklunds.com]
Unfortunately I was nowhere near Stockholm (or even the northern hemisphere) at the time, so I'll have to make do with the notes ;)
Judge slaps SCO counsel in SCO vs IBM
- SCO vs. The World (2003) Including the original code analysis by Bruce Perens
- WE URGENTLY REQUIRE YOUR ASSISTANCE (2003) SCO parody of a Nigerian scam
- Amusements from the SCO Group's latest filing (2004) with whoppers such as SCO claiming that IBM had little or no expertise on Intel processors.
- SCO Asserts Its Rights to Almost Nothing (2005)
In the latest hearing, which included a motion to disallow a surrebuttal to a rebuttal to a motion to dismiss based on ... sorry, I don't remember past that bit ... the action focussed on whether SCO had complied with the court orders and therefore whether it should be allowed to proceed with the parts of the case related to those court orders.
Throughout the morning the SCO counsel weathers some pretty heavy times - both from the IBM counsel and the Judge - but just put yourself in the shoes of the SCO counsel for a moment, and imagine the shiver running down your spine as this particular conversation with the Judge plays out (italics mine):
THE COURT: Let me ask you this: Is SCO in possession of -- can SCO provide additional specificity with regard to any of these items?
MR. SINGER [counsel for SCO]: We have had a couple months of additional work since December 22. It may be that on a handful of these items something has come up during that time period which would allow a more specific reference in
one place or another. But, in general, with what we're talking about here on methods and concepts, no.
THE COURT: Well, I guess what I'm asking you, basically: Is this all you've got?
Well, Mr Singer, is that all you've got? Of course the answer is yes - they don't have jack, but they keep on plugging away with the case, lest their share price fall when the investors realise that the emperor has no clothes.
Let the merriment continue (albeit at great expense to the defendant, IBM...).
Original court proceedings via Groklaw as always. This quote is from pages 75/76.
The evolution of programming languages (COBOL -> Ruby/Perl)
The gist of the conclusion of the article is this:
The dynamic languages make programmers so much more productive that even conservative business types are forced to sit up and notice. That's why I love Ruby on Rails, despite having not used it.
David Hansson, love him or hate him, has created a killer app which is turning even diehard Java enthusiasts to dynamic languages. There's a reason why Amazon, LiveJournal and Slashdot rely so heavily on Perl. There's a reason why Yahoo! decided to start using PHP. There's a reason why Rails is written in Ruby and not Java. I think we've finally hit the turning point where the economic forces at work are too great to ignore. Of course, Java will be around for a long time to come — COBOL is still widely used, for example — but it's simply math. The faster your programmers can turn out good applications, the more money you save (and can therefore earn).
I also love this quote from Randal Schwartz:
This power can be summed up in a response noted Perl guru Randal Schwartz made in response to a Java enthusiast (a student, I believe) asking him how he dealt with Perl's lack of "strong" typing. He replied "I just smile and move my program into production before the Java programmer has his first compile."
Truly, smugness worthy of a Unix user ;)
Possibly the earliest record of me on the Interweb thingy
You'll notice my old university email address firstname.lastname@example.org and remember, this was before the first graphical web browser (that was late 1993 if I remember correctly) and the main way of using the net back then (aside from email) was telnet, ftp, usenet and gopher (plus the occasional talk/ytalk and nethack session ;)
Ah, memories :) I can distinctly remember how excited I was when the engineering lab installed Sun IPX workstations that had floppy drives, so I could download Apple software and updates from bric-a-brac.apple.com (their main ftp server back then) and take them home on a floppy...
Update: The oldest actual content of mine comes a few months later in September 2nd 1993, with this post to comp.sys.mac.system where I reply to someone asking a question about the ohh-so-new System 7, pointing them to a utility called PowerSwitcher. One of the good things about PowerSwitcher, apparently, was that it only used about 2k of memory. Not even viruses are that small these days!
Where RSS republishing meets plagiarism
Taking someone's personal blog entries, changing the names, and then inserting the stories into your own blog is kinda sad. The comparison made in that story to a 6 year old exaggerating their playground exploits is probably a fair one.
It is obviously wrong though.
Less clear is the validity of republishing someone's RSS feed. For example, my RSS feed (and others) is republished wholesale on sites like http://ferrari.gooddigest.com/14.html - obviously with the aim of creating bulk content in the hope of driving clicks to their Google ads (the original content on my site has racked up $50 worth of clicks since 2003, so I don't see how it will work for them ;). It seems wrong, but is it? That page attributes my site and provides a link. It says "reported by", which implies that I have submitted the content to them, but in a way I have - I provide an RSS feed and effectively say "subscribe to this feed - I have things to say just like AP does".
Another analogy might be that it's not really that much different to cable tv distribution companies retransmitting the free to air channels on their cable without paying a fee (which has been upheld in the courts here in Australia).
In this case, my branding is getting a hammering by having my content identified with such a shonky website (which has even worse colour schemes than mine!), but what about a more "legitimate" site, like [redacted]*, where my content becomes part of one of their feeds. It's not linking, it's actual digital duplication of my (copyrighted) content. But in the case of [redacted], it's being duplicated by a service that is designed to help people find my content, so that's a good thing, right? Before you assume the answer is no, think about how different that is to a library, or Google Books.
For a fascinating, in depth and really interesting read about Libraries and DRM (Digital Rights Management) check out the recent Groklaw post "The British Library - "The world's knowledge" DRM'd and for a price".
Any feedback on this? Semi? DB? Lars?
* NB: I removed the link and name of the site at the request of the new owners of the URL, who run an entirely different site. Glad to see the old one is gone, but the problem still remains - I defy anyone to count the number of websites that re-publish Stack Overflow! -- 2014-10021
Service Oriented Architecture (or is it?)
To me SOA is mostly standard common sense of system interoperability and work distribution (using distribution as a non-technical term here). The whole business of building "SOA enabling software" to me totally misses the point. There is no SOA stack. Using SOA principles in designing some systems will dictate the use of some sort of messaging server or similar, but I don't quite understand how using millions of dollars worth of software from Fujitsu or the like is going to enable me to "do SOA". Remember that the "A" is for Architecture - it's a way of designing something, not a thing in itself.
I like the article's description of Loose Coupling, but the failed CD OOP analogy is yet another example of letting a new buzzword get away from you. The whole SOA concept is a nice encapsulation of some common sense ideas, and it is also a good thing that this buzz is encouraging vendors to open up internal functionality to allow better code and logic reuse. But the reality is that the reason no one has been doing that consistently is because it appears (on the surface) to be good for business to isolate your systems from everyone else. These days you only do that if you want to cop some stick from the EU.
Ah - rambling this is. Thus I have diluted the comprehensibility of the term SOA a little bit more!
Check out this little doozy in the Wikipedia page for SOA:
One area where SOA has been gaining ground is in its power as a mechanism for defining business services and operating models and thus provide a structure for IT to deliver against the actual business requirements and adapt in a similar way to the business.
A little bit of AJAX (scrub scrub scrub)
Photo widget ajaxified
The main driver for doing this is that the photo widget was adding a significant delay to page rendering - especially if you were the first visitor to arrive after the cache timed out at the server end. Now if you are that lucky person, you will see a spinning wheel in place of the photos until the data is ready (and even that is quicker as well). The widget is built dynamically from 3 randomly chosen photos out of the full list of ids so that you will get a fresh set on reload even if you're sitting behind a hugely aggressive proxy named Roger.
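The selection logic itself is tiny. Here's a hedged JavaScript sketch of the "3 random photos, no repeats" part (the function and variable names are mine, not the actual widget code):

```javascript
// Hypothetical sketch of the widget's photo selection: pick `count`
// distinct ids at random from the full list, so every page load gets a
// fresh set even behind a hugely aggressive caching proxy.
function pickRandomPhotos(ids, count) {
  const pool = ids.slice(); // copy so we don't mutate the caller's list
  const picked = [];
  while (picked.length < count && pool.length > 0) {
    const i = Math.floor(Math.random() * pool.length);
    picked.push(pool.splice(i, 1)[0]); // remove from pool so ids can't repeat
  }
  return picked;
}

const allIds = [101, 102, 103, 104, 105, 106];
const chosen = pickRandomPhotos(allIds, 3);
console.log(chosen.length); // 3
```

In the real widget the chosen ids would then be fetched asynchronously and swapped in for the spinner once the data arrives.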
Google ad box optimised
The other slight change to the front page is that the rendering of the Google ads is left to the bottom of the page, again to stop any delay in rendering the main content. Once the ads have downloaded and rendered (into a div with display:none) the rendered html content is copied up into the usual place.
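The copy-up step is only a few lines. A sketch of the idea (element names are hypothetical, and plain objects stand in for DOM nodes here so the snippet runs anywhere; in a browser you'd use document.getElementById):

```javascript
// Sketch of the deferred-ads trick: the ads render into a hidden
// container at the bottom of the page, then the finished markup is
// moved up into its visible slot so the main content is never blocked.
function promoteAds(hidden, visible) {
  visible.innerHTML = hidden.innerHTML; // copy the rendered markup up the page
  hidden.innerHTML = "";                // clear the off-screen buffer
}

const hiddenAds = { innerHTML: "<a href='#'>ad</a>" }; // the display:none div
const adSlot = { innerHTML: "" };                       // the visible placeholder
promoteAds(hiddenAds, adSlot);
console.log(adSlot.innerHTML); // <a href='#'>ad</a>
```

The nice property is that the slow third-party script runs after the content the reader actually came for.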
Comments form buffed
I have also played with making the comment submission ajax-based like in typo. Yes I know I've made disparaging comments about the typo ajax comments interface. I still think it's totally unnecessary. This was really just a simple exercise for me rather than something I think is actually beneficial. You'll only see the new interface when you use the add comments link at the bottom of a blog entry which already has at least one comment. The normal add comment links are untouched.
Postgresql Array types [www.postgresql.org]
Using the SQL99 standard syntax, they can only be single dimensional and you have to specify a fixed dimension. The utility of fixed arrays seems to me mostly a nice-to-have - you could, of course, emulate the same behaviour with columns.
Using the Postgres custom syntax, however, you can specify multiple dimensions and leave the dimension unspecified. This seems to be a really useful extension that would suit many common needs. I'm first thinking of a flag or parameter type of column.
Let's say that you are writing a more advanced job scheduler (think cron) and you have a db table where you're storing command line arguments for the timed program executions. You could store that as a plain string to be parsed by the executing shell, but then you're losing information (and have to be insanely careful with quoting, and had better not make the char limit too short). You could store the individual arguments as comma separated values in a string or text/clob field (how many times have you seen that little doozy in legacy code?). Or you could have a separate table for command line arguments, link it to the command table with a shared id, plus you would need an ordering integer so as not to lose the order of the arguments.
Or ... using Postgres array syntax, you could define a single column arguments as a variable dimensioned array of strings:
create table commands ( id integer primary key, command varchar, arguments varchar[] )

You can then get all the command details with a simple select command, arguments (assuming appropriate array support from your database driver), and you can find the number of arguments:
select array_dims(arguments) from commands where ...

You can find every command that is called with both the arguments --ignore-errors and --destructive:
select * from commands where '--ignore-errors' = any (arguments) and '--destructive' = any (arguments)

Another neat thing is that you can cast a resultset into an array, so you could eg. very easily denormalize a complex one-to-many lookup relationship into an array column with a trigger.
This is an example of why I love the Postgres project so much - they are way ahead of the curve on innovation (being behind only Oracle - but even then only in some areas) and yet more standards compliant than any commercial or open source RDBMS I have come across. They understand that the power of an SQL database is in the relational database concepts and pursue being a good RDBMS - rather than trying to be a database that "you won't get fired for using" a la MySQL or MSSQL.
Friendly blog links
So now instead of getting a permalink along the lines of /blog/one-entry?entry_id=1234 you'll get something like /blog/2006/05/05/mark-is-a-good-bloke (told you they were friendly).
This is going to play havoc with my Google analytics results. At least it's nearly the start of a month.
Shout out to Vinod Kurup, whose code snippet I plundered and modified.
Next up, compile tDom into aolserver 3.
Sheesh - I should just upgrade to OpenACS 5.x, but hacking is much more fun ;)
Update: Since I have also changed the urls presented via my RSS feed, those of you following in feed readers will probably find that the last 10 entries are now duplicated. Sorry about that, but there's nothing I can do since RSS readers treat the url as a primary key of sorts.
Domain Driven Design
1. How design fits into the Agile process
"In fact [Agile processes work] best for developers with a sharp design sense. The [Agile] process assumes that you can improve a design by refactoring, and that you will do this often and rapidly. But past design choices make refactoring either easier or harder. The [Agile] process attempts to increase team communication, but model and design choices clarify or confuse communication."
And that is one reason why I would take a handful of handpicked developers I trust over a huge (possibly outsourced) mass of development teams. It's not even about education - it's some mix of intuition, aesthetics, a desire to learn and commitment without stubbornness. One of the top 3 developers I have ever worked with from around the world has no formal tertiary education at all.
This next quote is a gem that I think applies to many iterative processes as well as software design. Preface p.xxv
"Exploration is inherently open-ended, but it does not have to be random."
2. A bad model (or no model) will bite you in the bum
"... Using a model in these ways can support the development of software with rich functionality that would otherwise take a massive investment of ad hoc development".
And there's the win. You just can't break through some ceilings of software complexity without a good model. Sometimes you can bludgeon your way through with a mass of ad hoc development. Other times your development team, number of lines of code, timeline and budget will scale beyond all reasonable proportions, such that no-one understands the problem any more, let alone the code, and you don't even have a language sufficiently powerful to discuss the problems.
I'll discuss the positive side of this equation in highlight (5) below.
3. Ingredients of Effective Modelling
- Binding the model & the implementation [early]
- Cultivating a language based on the model
- Developing a knowledge rich model
- Distilling the model
- Brainstorming & experimenting
YES YES YES! (pumps fist wildly in air) On reflection, every successful project I have been involved with has had all or nearly all of these points as strong elements of the design & implementation process. I have just never distilled them so clearly or heard anyone else succeed to that end. Fantastic!
4. The fundamental problem with the waterfall process
"In the old waterfall method, the business experts talk to the analysts, and analysts digest and abstract and pass the result along to the programmers ... This approach fails because it completely lacks feedback."
Evans again gets to the nub of the problem with the waterfall process - there's no feedback loop. Take a look at nature, there are feedback loops everywhere. Where there is no feedback, or the loop is too slow, there is brittleness and failure.
5. The problem with simplistic iterative process / Benefits of building a good model
There are problems with simple iterative approaches as well.
"Other projects use an iterative process, but they fail to build up knowledge because they don't abstract. Developers get the experts to describe a desired feature and then they go build it. They show the experts the result and ask what to do next. If the programmers practice refactoring, they can keep the software clean enough to continue extending it, but if programmers are not interested in the domain, they learn only what the application should do, not the principles behind it. Useful software can be built that way, but the project will never arrive at a point where powerful new features unfold as corollaries to older features."
And that my friends, is when a designer/developer really earns his dough - when powerful & valuable features, that are easy to implement on the codebase, become obvious out of the very nature of the model itself - often to the astonishment of the domain experts & users, who say "why didn't I see that... I never even realised that that is what our process was doing".
I'd better stop quoting now before I start running into copyright and fair use issues! I'll come back to you when I've digested the suggested processes and give feedback on that. Normally I would be very suspicious of any purported 'procedural approach' to software modelling, but I am so impressed with what I have read so far that I'm happy to believe that Evans can come up with the goods.
In the mean time if software development and/or design is your thing, you *really* need to buy this book.
Startup lessons from the ubiquitous Paul Graham [www.paulgraham.com]
One very interesting footnote is an important lesson for any old-school technology and business people to learn. The populace in general have begun to "get" this web thing - and it's not like distributing shrink-wrapped software or a broom, it's much more like real life or perhaps a reflection of it:
 A web site is different from a book or movie or desktop application in this respect. Users judge a site not as a single snapshot, but as an animation with multiple frames. Of the two, I'd say the rate of improvement is more important to users than where you currently are.
I don't like the title of lesson #7 (Don't get your hopes up) but I have learnt the value of this part of the lesson:
Startup founders are naturally optimistic. They wouldn't do it otherwise. But you should treat your optimism the way you'd treat the core of a nuclear reactor: as a source of power that's also very dangerous. You have to build a shield around it, or it will fry you.
The shielding of a reactor is not uniform; the reactor would be useless if it were. It's pierced in a few places to let pipes in. An optimism shield has to be pierced too. I think the place to draw the line is between what you expect of yourself, and what you expect of other people.
And not only in the context of being let down, but in the context that if you have extremely high standards you just can't expect everyone to be the same. If you find some people who are, happy days, but if you expect people to be what they are not - well, that's just an exercise in frustration and will reduce your productivity and theirs.
Paul continues with:
Shielding your optimism is nowhere more important than with deals. If your startup is doing a deal, just assume it's not going to happen. The VCs who say they're going to invest in you aren't. The company that says they're going to buy you isn't. The big customer who wants to use your system in their whole company won't. Then if things work out you can be pleasantly surprised.
Which I also know something about ;) It is especially important in that latter context of customers - when you're new and naive to business you can spend a lot of time on a few big customers who end up not being what you thought and in the mean time you have passed up the 50 annoyingly small customers who would have paid your rent, leaving you with nothing (sob).
The only way a startup can have any leverage in a deal is genuinely not to need it. And if you don't believe in a deal, you'll be less likely to depend on it.
So true. It works.
Safari crashes if you eval() a "really long string"
Something to keep in mind.
So I guess I have to do it the proper way and use the REST api at 23hq.
Freud pities the fool
"Being an adult entails overcoming the difficulties and implementations of that which forms a personality. Fostering your desires. Putting up resistance. Asking why and not silently accepting everything. It is about standing up for that which is really important, with quiet determination."

Holding perhaps an equal amount of insight is another article in today's SMH:
The TV Land network in the US announced that it will start I Pity the Fool, a series where The A-Team star travels across the country dispensing inspiration and advice.
"The 't' stands for talking," Mr T said.
"My show ain't no Dr Phil, with people sitting around crying," he said.
"You're a fool - that's what's wrong with you. You're a fool if you don't take my advice."
The only thing you might notice is that if you subscribe to BOTH my regular blog and developer blog (dev-blog) in a news reader, you will get two copies of each new posting (including this one). If you do, just unsubscribe one of them.
Deep C Secrets
What I found was Expert C Programming (Deep C Secrets) by Peter van der Linden.
It's a book on C that a Perl programmer could love! It's at times amusing, and contains good anecdotes and puzzles to solve. What's more, it contains the C code for a complete BASIC interpreter. No code contained in a book has been as cool since a UNIX book I bought in the early 90s had the MINIX source inside.
I didn't buy it since the inflated Australian tech book pricing meant that it exceeded this week's discretionary budget, but I'll be back to buy it one day.
The amazon.com editorial sounds about right:
Defying the stereotypical notion that technical books tend to be boring, Expert C Programming offers a lively and often humorous look at many aspects of C--from how memory is laid out to the details of pointers and arrays. The author reveals his points through invaluable anecdotes, such as stories of costly bugs, and through folklore, such as the contents of Donald Knuth's first publication. Each chapter ends with a section entitled "Some Light Relief," which discusses topics (topics that some may consider to be "recreational"), such as programming contests. A fabulous appendix on job interview questions finishes the book.