Sunday, 31 May 2009

Net Present (Lack of) Value

Once again I sat through a meeting about NPV calculations and the assumptions behind them. This time NPVs are being used to calculate sales bonuses. Huh? You assign the loan to the gal who sold it: every month the loan instalment is paid, she gets a percentage, just like a regular salesman does. Who needs NPVs?

There is one circumstance when you do. That's when you want to sell a commercial building with a known and reliable rental income from a tenant who isn't going anywhere until the end of the lease. Then the NPV of the cash stream is the capital sum you would need to invest now, at the discount rate you chose, in a quarterly-paying bond that generates the same cash flow. (Quarterly because commercial rents are usually payable on quarter-days.) And that interpretation only works because the cash flow from a commercial lease is basically the same as the cash flow from a bond.

In this example, the net income stream can be forecast accurately and with a high degree of confidence, and the NPV itself has a real interpretation: it is the value of a bond, and it is the purchase price of the building.
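Roughly, and with numbers invented purely for illustration (the rent, the remaining term and the 6% discount rate below are all made up), the calculation looks like this:

```python
# A minimal sketch of the building example: the NPV of a reliable quarterly
# rent stream is the lump sum you would have to invest today, at the chosen
# discount rate, to replicate that stream. All the figures are invented.

def npv(cash_flows, annual_rate, periods_per_year):
    """Discount a list of equally spaced future cash flows back to today."""
    r = annual_rate / periods_per_year              # periodic discount rate
    return sum(cf / (1 + r) ** t for t, cf in enumerate(cash_flows, start=1))

rent_per_quarter = 25_000      # hypothetical quarterly rent
quarters_left = 40             # ten years left on the lease
discount_rate = 0.06           # chosen discount rate, 6% a year

price = npv([rent_per_quarter] * quarters_left, discount_rate, 4)
print(f"Capital value of the rent stream: {price:,.0f}")
# That figure is a sensible purchase price only because the lease behaves
# like a bond: fixed payments, known dates, a payer who isn't going anywhere.
```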

Now consider a mortgage or personal loan. This has a net income stream, payable monthly. It is much more difficult to forecast: loans are closed early, the default rate is very high (banks make loans to people with a 30% chance of defaulting and think it's good business – the people who financed your neighbour's sofa live with default rates of 50%) and the timing of a repayment or default is erratic. The bank can calculate averages, but the variances are high and the tails correspondingly fat.

But here's the real difference: for a commercial building, the net income stream repays the capital amount, but for a loan, the capital is repaid by the gross income. So if you discount the net income from a loan and call that the NPV, whatever meaning it has, it is not the amount you would pay for the loan. If you wanted to own the net income stream from the loan, you would have to buy the whole of the outstanding capital at the time of purchase, less a discount for the expected defaults over the remaining term of the loan. So if the discounted net income isn't the purchase price, what is it? It turns out to be the amount the shareholders would need to invest in a monthly-paying bond, offering the rate of return used in the discounting calculation, to yield the same cash flow as the loan's net income. So it's the value now of the net income stream to the shareholders. It's what the loan is “worth” to the shareholders.

Or at least that's what everyone around me keeps saying. It sounds convincing. Except it isn't what the loan is worth to shareholders. What the loan is worth to shareholders is the contribution it makes towards their annual dividend payments, and that is measured by the gross margin (net income minus variable costs) of the loan taken year by year. As a rough guide, most of the value of any reducing-balance instrument is in the first half of its life.
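As a rough check on that last claim, here is a back-of-envelope sketch (the loan size, rate and term are invented): on a standard amortising loan, the interest element of each instalment, which is where the margin comes from, is heavily front-loaded.

```python
# Rough illustration of why most of the value of a reducing-balance loan sits
# in the first half of its life: the interest element of the fixed instalment
# shrinks as the balance reduces. All the figures are invented.

def interest_by_year(principal, annual_rate, years):
    """Interest earned each year on a standard monthly amortising loan."""
    r = annual_rate / 12
    n = years * 12
    instalment = principal * r / (1 - (1 + r) ** -n)   # fixed monthly payment
    balance, result = principal, []
    for _ in range(years):
        year_interest = 0.0
        for _ in range(12):
            interest = balance * r
            balance -= instalment - interest
            year_interest += interest
        result.append(year_interest)
    return result

interest = interest_by_year(10_000, 0.12, 6)           # hypothetical personal loan
first_half = sum(interest[:3]) / sum(interest)
print(f"Share of lifetime interest earned in the first three years: {first_half:.0%}")
```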

Do not ask me why they calculate the NPV – I suspect it's because they're in banking and calculating NPVs is what you're supposed to do when you're in banking, like taking ecstasy when you go clubbing. That the discount rate they use is around 12% – good luck getting that in the money markets – makes the whole exercise silly.

So aside from the fact that it's the wrong measure, what else is wrong with using NPVs? It's a distraction from finding the right measure for the job; it gives them the undeserved feeling that they are being sophisticated and clever; and it soaks up analysis time doing monthly lifetime factorisations of all sorts of things that should be left as totals: defaults, bad debt, early closures. It creates an air of utterly spurious accuracy. But hey, get with the programme. This is, after all, the same industry that thought lending money to people with no incomes was a good idea.

Thursday, 28 May 2009

William James - Part One

"...if you want an absolute duffer in an investigation...take the man who has no interest whatever in its results; he is the warranted incapable, the positive fool. The most useful investigator...is always he whose eager interest in one side of the question is balanced by an equally keen nervousness lest he become deceived." (William James, The Will To Believe)

I've been reading William James this week. A very long time ago, when I was a teenager discovering philosophy and libraries had Real Books in them, I tried to read James' Psychology, but I don't think I got very far. I'm reading the Pragmatism and Other Essays Penguin Classic and I have The Varieties of Religious Experience in the stock-cupboard.

Pragmatism is the book A J Ayer modelled Language, Truth and Logic on – the use of a single simple principle to cut swathes through metaphysics, morals and pointless disputes. I'm not too sure they don't cover much the same ground. James's bluff, breezy, conversational writing style influenced the authors of the "Big Book" of Alcoholics Anonymous (a work with which I am very familiar). The idea that we weave new information and experience into an existing web of belief and knowledge, possibly rejecting it if it doesn't fit, is one of Quine's Big Ideas – though Quine's formulation is usefully more detailed. It's also been travestied as "the coherence theory of truth".

Anyway, I found the quote rather apt in the light of my interest in mistakes. I'd slowly come to the realisation that one reason I didn't see wrong numbers is that I had no expectations as to what the numbers should look like – I wasn't looking for a particular result, and I wasn't looking because I didn't care about the result – I was more interested in the method of getting there and how to get there more quickly next time. So mistakes just sailed on straight past me.

Being "the warranted incapable" is not such a good position for an analyst to be in.

A couple of photographs of London, snatched during an inter-office trip...

Seven Dials, Covent Garden in the afternoon



St Paul's Courtyard, lunchtime



By the Tate Modern...





Wednesday, 27 May 2009

80 / 20 Development

That old Pareto rule applies to software development: you can get 80% of the user's desired functionality for 20% of the effort needed to produce a proper application. Iterate the rule once more and you get 96% of the requirement for 36% of the total effort, which is pretty much as far as most people go. Especially when the application is being developed in Office.
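Where those two figures come from – the rule applied a second time to the functionality and effort left over – in a two-line sketch:

```python
# 80/20 applied twice: the first pass, plus the same rule on what's left over.
functionality = 0.80 + 0.80 * 0.20   # 80%, plus 80% of the remaining 20%
effort = 0.20 + 0.20 * 0.80          # 20%, plus 20% of the remaining 80%
print(f"{functionality:.0%} of the requirement for {effort:.0%} of the effort")
```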

There's an excellent essay called From Windows to Cocoa in which Peter Bright describes three kinds of programmers. The first kind are really business analysts, the second are the guys working on your company's big programs, and the third are the real craftsmen. Here is Peter's description of the first kind:

...basically business analysts; they're using Access or Excel or VB6 to write data analyzing/number crunching applications. These things are hugely important in the business world, totally unexciting to anyone else, and the people writing them aren't really "programmers." I mean, they are, in the sense that they're writing programs, but they're not especially interested in programming or anything like that. They don't really care about the quality of the libraries and tools they're using; they just want something simple enough that they can pick it up without too much difficulty. They'll never write the best code or the best programs in the world; they won't be elegant or well-structured or pretty to look at. But they'll work. Historically, as I said, these are the kind of people who Access is made for. Access is a great tool, quite unparalleled. Sure, it's a lousy database engine with a hideous programming language, but the power it gives these people is immense. So Access and VB6 and Excel macros are where it's at for these guys.

That's pretty much me and all the people around me. Except there are two levels within this first group. There are the 80/20 developers, and there are those of us who try to write robust applications. What's an 80/20 developer? He's the one who writes 80/20 applications.

An 80/20 application dumps you into the middle of the VBA editor with an error message like “Get Range method of current object failed”. When you call the guy (have you ever met a girl who cuts VBA?) who developed it, he says: “You should have removed the “?” from the data” or “Oh, you can't do that with this, it doesn't make sense”.

An 80/20 application has user forms where all the buttons are active and have names that don't quite describe what they do. It is possible to push one of these buttons and get obscure error messages and not be able to get back to where you were. When you call the guy he says “you can't press that button until you've loaded the csv file and entered the magic password. I did explain that.” That's not how a user interface is supposed to work.

An 80/20 spreadsheet has range names that don't describe what the range is for or has in it; it has un-commented code, functions that re-implement something the object model already has a method for, non-descriptive variable names, no error-trapping, no exception-handling... and it's hell to maintain.

Sure, the 80/20 application does what the users want it to do. Almost. If treated with kid gloves and fed only carefully-prepared data.

For reasons I don't understand, people think that just because it's Microsoft Office, they don't have to read anything. There is more to learn about the Excel object model than there is in the usual first-year undergraduate course in mathematical methods. Learning Java with all its main libraries as well as finding your way round Eclipse or NetBeans will fill up more space in your head than a geology degree. There are 700-page books full of the tricks and annoyances of Excel, Access and Word.

Developing an application properly takes time, and it's that time which lets you find your way round the object models. Time is exactly what you don't have. Bash this report out, run that simulation, cut that query, attend this meeting, make up that presentation. Busy work that benefits nobody. We're not here to learn things, we're here to fill in forms, work processes and make the company money.

Well, you can't do what you don't know how to do. You can't be innovative if you don't get time to think, read and talk with other people. You can't do things smarter if you're permanently dumb.

Steve McConnell's Code Complete is pretty much everything most people will need and want to know about good programming style. Buy, read, enjoy and digest. Even picking three things from it to improve your programming will help.

Tuesday, 26 May 2009

The Philosophy of Mistakes: Interlude

I'm going round and round in circles on the subjects of mistakes and testing. As ever, it's because I'm arguing with a voice in my head that doesn't belong there. It belongs to a previous manager of mine, a man with a very low level of technical skill and knowledge and paranoia way into the red zone. What he wanted was the assurance that nothing would ever come out of our pricing engine that could cause him embarrassment. Nothing. Ever. But it all had to be black-box testing, because if he (and others in the team) looked inside the box, they would not really understand what they were looking at.

Now everything I have read on software testing says that the vast majority of mistakes and omissions are found by a thorough code review conducted by two relevantly skilled people. Black-box testing is a very bad way of finding mistakes. Why? Because you don't need many input variables with a reasonable range of values before a “perm all the values” test suite becomes a billion or more records long. You can't construct it, let alone have the time to run it. So you have to run a subset – a very small subset of maybe a hundred thousand records. What are the odds that a specific flaw falls inside that subset? About 0.01% at best. This is not going to work. Plus, you have to construct the test cases, which means you need an engine to produce the “answers” for those cases – in software, since no human being is going to work out the answers to more than about a hundred cases accurately – so you get stuck in an infinite regress of checking the checking.
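To put rough numbers on the explosion (the ten variables with eight values each are an invented illustration, not anyone's real pricing engine):

```python
# Why "perm all the values" doesn't scale: the size of the full test space,
# and the chance a random 100,000-record subset covers the one combination
# that triggers a flaw. The variable counts are invented for illustration.

variables = 10
values_per_variable = 8

full_space = values_per_variable ** variables        # every combination
subset_size = 100_000                                # what you can actually run
chance_flaw_covered = subset_size / full_space       # one specific bad combination

print(f"Full test space: {full_space:,} cases")      # 1,073,741,824
print(f"Chance a random subset hits the flaw: {chance_flaw_covered:.4%}")
```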

What, in other words, my paranoid manager needed to do was sit with the programmers and get a detailed walk-through of the code behind the pricing engine and how well they had foreseen problems and dealt with them. He would then have an idea of where specific problems might arise and what he should be testing for. But no, that was never going to happen.

Testing goes back to the days when people tried to pass off low-grade precious metal currency on each other – so you needed to test that you really were getting a genuine shekel or twenty-carat gold or whatever. Testing for that is simple. So is testing that the latest batch of cannon aren't going to blow up all over your soldiers – that's why it's called proof-firing (hence the Aberdeen Proving Ground). Any gun that survives a dozen shots is unlikely to blow up later – it's in the nature of the beast. As the system gets more complicated, so does the testing. In the end, you have to accept that if you're going to have anything as complicated as a nuclear submarine or a Boeing 747, some things are going to be less perfectly finished than others and it may even sail or fly with a couple of non-life-threatening snags. That's the nature of the real world. The same goes for software – though the desired standard is more XP SP1 than Windows Vista. Perfect, error-free programs are either very few if complex, or very small if plentiful. What we get in ordinary business applications is going to be well short of perfection, unless it's fairly simple.

Testing is and always was there to prove that under normal circumstances your widget works, not that under some weird conditions it doesn't. If it's a mechanical widget, those weird circumstances may not be too hard to find – too hot, too cold, too much dust, but who foresaw “the wrong kind of snow”? If it's a software widget, those weird circumstances might be there, but may never occur. Software isn't like a fugaze coin – which shows up bad when it tries to be good – rather, software can work just fine, except when...

And no amount of black box testing will ever find it. What you need is sensible peer-review. Like that's going to happen in a modern organisation where everyone is a single point of failure.

Monday, 25 May 2009

Which Lady Is The Tramp?

The other morning on the commute I was listening to the Pure Jazz Moods 2-CD (it's the commute, you're allowed to listen to anything that puts you in the right mood) and on came Ella Fitzgerald singing The Lady Is A Tramp. And with the last lines “I'm alone when I lower my lamp / That's why the lady is a tramp” I realised what a neat bit of ambiguity Rodgers and Hart had pulled off. You're supposed to think, well, I had always thought, that it's the singer who is the tramp, because she likes the beach at Coney Island, can't eat late, doesn't play craps with barons and earls and is a “hobohemian”. Except she turns up on time for the opera, stays awake all the way through, actually “reads every line” of Walter Winchell, doesn't do bitchy gossip (“won't dish the dirt / with the rest of the girls”), is quite happy with boating on Central Park (as opposed to a yacht off Newport) and when she goes to bed there isn't a lover there. None of which is true about “ladies” – so it's the Lady who is the tramp, because her manners are fake, rude and expensive. In fact, the Lady is a moral tramp, while the singer is only a social tramp.

It's the end of the Bank Holiday – deep thoughts are suspended for a day.

Saturday, 23 May 2009

Countdown to July

It does not matter how many times I sit through a re-organisation, the waiting does not get any easier. You know they are plotting to get rid of everyone they don't like and who doesn't fit in, while lying about how it's all about organisation, skills and office locations.

I'm worried about losing the salary, of course. The possible eighteen-month job search is not something I'm looking forward to either. But I have some money – even though it's supposed to be for my pension – and I know I'm not going to lose my mind or identity being out of a job. Not out of work: my work is as a philosopher, logician, human being (washing, cooking, keeping a pleasant house – this is work) and all-round creative person. When I have a job, I'm usually out of (my) work.

What is really irritating me is the thought of having to listen to a load of patronising bullshit, while the HR department “support me in my transition” and explain that I shouldn't take it personally. This is claptrap: if it has my name on it, it's personal. The only things that aren't are circulars addressed to “The Householder”. The moment they tell me, I want out. No handover (what's to hand over – if they don't need me, they don't need what I do or what I know), no hanging around: give me the Compromise Agreement, the breakdown of the severance payment, structure it to be tax-free and I'll sign and return it as soon as my solicitor says it's okay. Otherwise, I'm outta here. Pack my case and go – right then and there. Before they can even start their box-ticking speech.

I am fifty-five years old and this is the fourth time I will have been made redundant in July. My pensions weren't worth much last year and are worth even less after what's happened to the markets since. You don't want to know the unemployment figures for men over fifty-five. They want me to sit still while they run through a speech that makes them feel good about what they're doing: they need me to sit there and nod along, or they won't feel justified in themselves. The hell I intend to give them that satisfaction.

Walk around our offices and you would not know all those people think they have a good chance of losing their jobs. Management say they think it's because we're all being “professional”. They don't think it's anything of the sort – they know it's because everyone is in denial. It hasn't happened to some of them, so they don't know it's real yet. They still think they are needed. They have yet to learn that no-one is needed, that everyone is fodder and management think that whatever you do can be done better by someone else or didn't need to be done at all. It's after it all happens that the mood will turn sour and the “professionalism” vanish. By then, I'll be gone.

While I have had myself “out there”, I haven't been that vigorous in finding something. Now I have to go through some motions – not because it will result in a job offer, but because it will make me feel like I'm being pro-active. We're going to hear in the next two to four weeks and I'd like to be running when the ground hits me. Maybe it was just about okay to believe that I had a job in the new organisation until last week, but now it isn't. The odds have swung. If I leave the serious search any longer, I will feel bad about myself.

That's what this is really all about. It has nothing to do with them. It has to do with me: I'm not acting as I need to be. I have to not “judge myself mercilessly” but to start the work. If only I didn't hate job-hunting. And it's only just hit me why. You, gentle reader, may think that the next job is going to make your life better and be fun or interesting or give you lots of travel or whatever it is rocks your boat – but I don't think that about my next job. I can't make money from my work, I can only make it, as so many of us have to, from my job.

On Being an Analyst in a Bureaucracy

For reasons beyond me, people want to be managers rather than analysts. I regard the title 'analyst' as a badge worth wearing, whereas I'm not so sure that 'manager' is really worth it – these days, managers in large organisations are little more than bag-carriers for the senior guys, errand boys “sent by grocery clerks, to collect a bill”, as Colonel Kurtz describes Captain Willard in Apocalypse Now.

There are jobs with 'analyst' in the title which are more concerned with processing, say, prices for commercial accounts from the salesman's proposal into the computer systems – these are really administration roles. Then there are the poor bloody infantry sorting out the errors in vast databases – these are data administrators, and no less valuable for that.

The role of an analyst is to source, interpret and report information and provide an informed view on what that information means for the future. It's to dig into the numbers and then assemble a picture – which is both analysis and synthesis. If this is done at all now, it's done by top-flight investigative journalists when they write books, and maybe by a handful of stock-pickers in the financial industry.

That can be difficult to do well or with integrity when the information and evaluations are contrary to the chosen aims and ambitions of the executive. CIA analysts had this problem in a big way with the Bush administration. An analyst whose interpretations don't fit the current policies is usually told to get with the program: in most organisations, raising money or gaining management support is everything, and figures and anecdotes are plucked from thin air to support the policies.

You can't be an analyst in a bureaucracy. Bureaucracies are inherently political, and politics is the opposite of honest, creative endeavour – scientific, artistic or technological. Managers in bureaucracies do one of three things: push initiatives they believe will advance their careers, resist changes pushed by career-advancers, or act as the willing servant of their senior managers. Everything they do is with a political end in mind, everything has an ulterior motive, which is why their annual appraisals are dishonest and hypocritical. Bureaucracies are not knowledge organisations: knowledge implies truth and an authority independent of rank and personality. That is incongruous in a bureaucracy: a manager does not have to be right, they just have to be senior. If your Director says that making loans to high-risk individuals so that you can book the up-front insurance premiums and a bunch of income in the next three months is the way to go, and damn the bad debts later, then by golly, that's the way all the under-managers will go and it's the song they will all sing.

Bureaucracies are not skill-organisations: skills, technical ability and expertise have the same implication of truth and independent authority, and are therefore anathema. That's why big organisations don't train people in transferable, market-valuable skills: not because “they might leave” but because then those people would know something that could not be contradicted by management to suit some bogus policy.

An analyst needs seven things: knowledge of their industry and market; a broad general knowledge of economics and demography; a grasp of technicalities from statistical analysis through cost accounting to SQL query writing and programming; the ability to present information clearly and concisely; a clear-eyed understanding of the way the world works; a grasp of the principles of formal and informal logic and of epistemology. Oh, and a sense of humour.

Bureaucracies take analysts and reduce them to jargon-spouting, cliche-scribbling number-crunchers, because the value is in the organisation, not the people, so the people must not have unique skills and knowledge. If there is a need, such as in IT, it must be devalued as much as possible by being outsourced and off-shored. The only thing of value to the management of a bureaucracy is the ability to source and present figures that support their positions.

What I'm doing working where I am, I don't know. Oh, yes, I do. The location is one of the best in the world. No kidding.