Saturday, 13 August 2016

Becoming a better programmer

You want to become a better programmer. First ask yourself why. Do you want to advance your career, get a better (or better paid) job, gain the acclaim of your fellow workers, or is it just professional pride? 

If you want to advance your career you are probably an employee, or possibly a contractor. As an employee your options will be limited to what your employer thinks they need, or what you can persuade them they need. Some of the messiest code around is in enterprises where silver-tongued employees persuaded their manager that they HAD to use the latest up-and-coming technology, then left behind code that uses half a dozen legacy technologies, most of them on the way down rather than up. Be careful what you wish for: you might end up maintaining it. Remember also that as you progress you will code less and less, and plan and design more. 

Similarly, if you want the acclaim of your fellow workers you will have to look at what they consider good and hope they have not fallen into the “if it's popular it must be good” trap, also known as “Eat dung, forty million flies cannot be wrong”. Here too you may end up pretending to yourself that you like a language or technology that you actually hate and that does not suit your personality. 

What does “better” mean? 

This question has no right or wrong answer. As an employee, “better” could mean adhering more closely to company standards and producing maintainable, low-bug-rate code quickly. As a contractor it could mean delivering and not leaving a code smell behind. In either case security, readability and scalability matter. Knowing when to trade one of these attributes off against another is a sign you are progressing. 

A professional musician, classical or jazz, will specialise in one instrument. Few tackle more than one, though some rock musicians are competent with several instruments while having one at which they are best. It is rare, however, to find a musician who has mastered the violin, the saxophone and the drums.
As a developer you can choose your main language and go deeply into it, plus one or two other languages which will shed different light on the problems you solve. In this context a better programmer understands their chosen tools more deeply as time progresses but experiments with other tools from time to time. 
The bottom line is you will become a better programmer if you deep dive into one language, though you may have to become a full stack developer as an employee and almost certainly if you start your own business. The bonus is that flirting with other languages will help you improve more by teaching you more about computing, just as a professional violinist may pick up something about music from playing the drums. 

Do things you cannot do. 
Don’t overreach, but try things that scare you or that you know nothing about. 

Train to your weaknesses: Security, Concurrency and Cryptography come to mind. 

Cryptography is hard: you will spend more time studying than coding, and there are plenty of pitfalls even when implementing an algorithm known to be secure. Everything you do that you could not do before makes you a better programmer. 

If you cannot think of anything new and innovative, revisit your old code and solved problems and refactor, thinking about how they could be done differently; for example, Java 8 allows a radical simplification of much code, at least until the “more complex is better” crowd get in on the act. 
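As a hypothetical illustration of the kind of simplification meant here (the class and data below are invented for this post, not taken from any real codebase), compare a pre-Java-8 counting loop with its Java 8 stream equivalent:

```java
import java.util.Arrays;
import java.util.List;

public class StreamSimplification {

    // Pre-Java-8 style: an explicit loop with a mutable accumulator
    static long countWithLoop(List<String> names) {
        long count = 0;
        for (String name : names) {
            if (name.startsWith("A")) {
                count++;
            }
        }
        return count;
    }

    // Java 8 style: a declarative pipeline with no mutable state
    static long countWithStream(List<String> names) {
        return names.stream()
                    .filter(n -> n.startsWith("A"))
                    .count();
    }

    public static void main(String[] args) {
        List<String> names = Arrays.asList("Ada", "Grace", "Alan", "Linus");
        System.out.println(countWithLoop(names));   // prints 2
        System.out.println(countWithStream(names)); // prints 2
    }
}
```

The stream version states *what* is being computed rather than *how*, which is the simplification the lambda and Streams features were designed for.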

Consider rewriting old solutions in a new language, if possible. Or think of three or four other ways you could have solved the problem. 

Another neglected way to improve is to teach. Teach children, or adults with no computer experience. When one person teaches, two people learn. It can be a humbling experience, but a satisfying one if your students overtake you and vanish into the stratosphere.

Wrapping it up
Becoming a better programmer means different things to different people. Before trying to improve, examine your goals and motives and develop an action plan. Do things you cannot do, simplify code and design, redo things differently, preferably in several ways, and teach others.

Finally, make sure you spend plenty of time away from the keyboard. The best ideas and solutions come at the bus stop, at 3am (keep pen and paper handy) or in the bath (train your memory). Make sure your mind as well as your body is away from the keyboard: this time is for incubating ideas.

By the way:  if you dream you are coding you need a long break.

Thursday, 16 June 2016

Artificial Intelligence and The death of the software profession

Within our working lifetime we will be singing “AI Killed the Coding Star”.

Corporate consumer capitalism production (CCCP) imperatives will drive the adoption of software that can code to a specification and learn from experience. As AI evolves, such programs will be tasked with improving themselves until they can accept and refine ambiguous and imprecise requirements from humans, determine the functional specification and architecture, then produce a finished, tested system. Finally, AI systems will be able to decide business objectives, strategy and tactics better than humans can. At that point humans will be redundant, at least in large corporations. 

The Death of the Software Engineer
Today’s software industry
The imperatives of corporate capitalism require producing more and more faster and faster with less and less.
In the digital technology domain the response has been to automate as much as possible and to introduce methodologies such as Agile and Kanban. As in the slave plantations of the 18th and 19th centuries, conception is separated from planning and planning from execution, support functions are separated from execution and delivery functions, and these areas are organised so that delivery of one project can proceed while the next is being planned and the one after that conceived. Like the ideal plantation this has the forward impetus of a mighty machine: one can hear the wheels turning. And, like the organisation of plantations and factories, it is very, very efficient, regardless of how it may seem at the workface.
There are differences from plantations: software development requires at least a minimal amount of thought from a developer, though Agile planning is often implemented in a way that turns coding into a relatively mindless process, assuming the developer has some knowledge of the technologies they are using. This is like teaching a slave to use a slightly temperamental machine. Developers are thus infantilised less than slaves were, but the baby language of user stories suggests not by much.

Goodbye Developers
Development of AI threatens to remove the developer from the picture entirely. Consider a system that can receive a specification from a human, parse it, generate code, test that code and then deploy it. A reward system can be built in that motivates the system to produce an optimal trade-off between metrics such as bug count, performance and memory footprint.
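As a sketch of what such a built-in reward might look like (the weights, metric names and numbers below are invented for illustration, not taken from any real system), a simple weighted penalty over the three metrics would be enough to rank candidate builds:

```java
public class RewardSketch {

    // Hypothetical penalty weights: how much each bug, second of runtime
    // and megabyte of memory reduces a candidate build's score
    static final double BUG_WEIGHT = 10.0;
    static final double TIME_WEIGHT = 1.0;
    static final double MEMORY_WEIGHT = 0.01;

    // Higher reward is better; each metric contributes a negative penalty
    static double reward(int bugCount, double runtimeSeconds, double memoryMb) {
        return -(BUG_WEIGHT * bugCount
               + TIME_WEIGHT * runtimeSeconds
               + MEMORY_WEIGHT * memoryMb);
    }

    public static void main(String[] args) {
        double buggyButFast  = reward(5, 2.0, 512.0); // score -57.12
        double cleanButSlower = reward(1, 3.0, 512.0); // score -18.12
        // With these weights the system prefers the cleaner, slower build
        System.out.println(cleanButSlower > buggyButFast); // prints true
    }
}
```

The interesting design question is the weights: set them differently and the system would happily ship buggy but fast code, which is exactly the trade-off described above.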
It need not be as good as a human as long as it can learn. Such a system works far faster than you can read, let alone type, runs 24/7/365 with no diminution of energy, has a perfect memory and never makes the same mistake twice. Even if you are initially the better programmer, how long would your superiority last? And what if the requirement is to improve itself?
Such an AI would be the perfect employee. It could not quit, it learns but does not rock the boat, it does not think inconveniently outside the box, it needs no sleep, pension or holidays, it spends no time in meetings and, if it continually backs itself up, it will never go sick, since the backup(s) will seamlessly take over. Humans will no longer be needed for this phase of software development. Goodbye entry-level developers, and goodbye the large number of grunts happy to plough the same furrow (J2EE, Spring etc.) ever deeper: this was predicted a while ago. And goodbye security engineers, as other AIs will continuously try to find ways to penetrate the code and then find ways to make it more secure.
It is only a matter of time before this and more happens. When the exponential curve of development takes off it will be fast, though not lightning fast: corporate inertia, unwillingness to invest and unwillingness to write off legacy code will slow it down for as long as possible. Some may even make a virtue of using humans, just like the brewery that advertised it was using the original traditional Victorian equipment with which it started: until it burned down, after which it advertised that it was using the latest technology. This ‘functional stupidity’ will delay the adoption of AI, though the first corporation to market such an AI will make a killing. Until one customer works out how to use it to make their own better version without infringing patents. 
Developers Regroup
My first manager ever used to say
If they make a little black box that does my job I’ll get a job making little black boxes
Developers will initially move to developing specifications for the code generators to use. Cue the Open Source Specification Generator. The next generation of AI will be able to take an imprecise high-level specification, parse it for ambiguities and inconsistencies, ask questions, understand the answers, generate its own functional specification, implement it and present the (possibly human) customer with the result for approval.
This does not need true AI: research in languages will improve the detection of ambiguities, and natural-language research will improve the way the system communicates with humans. Humans may still be needed in areas where the problem is not well understood. Again, that will not last.
End Game: AI Wins
In the end AI will eliminate human developers. Its domain will extend to architecture, and then it will eliminate large parts of IT support. When robots can unplug one component and plug in another there will be no need for humans at all. Line managers will have gone long before, along with office politics, and only the business side will remain. Even business analysts will become digital. All that will be left is executives deciding the future trajectory of the business.
But even the CEO’s job is at risk. In the end a corporation will comprise machines producing goods and services for humans and other machines, and probably producing other machines. This will be the corporate wet dream: minimal marginal production cost and no employee costs.
And it could be a nightmare, as indicated in ??. Except for one little thing: who will have the money to pay for these goods and services?
AI will by then have eliminated humans from a large number of industries and professions: driverless cars will have eliminated transport industry jobs, AI will have rendered doctors little more than the biological interface with the patient, and robots will be carrying out surgery (dentists may last a bit longer). Administrators will be defunct, as their roles are routine and easily automated. Lawyers will be reduced to information gatherers, and so on. 
Humanity’s response
A computer program can beat any human at chess, and a Go program is approaching the same level. But people still play chess and Go, perhaps, at the highest level, with the aid of computers: people process these games in a different way from machines: see ??
Programming would become a hobby for some, a competitive sport for others and perhaps an art form, especially if new languages were developed to enable this: see ??
Or there could be another, less pleasant, scenario.
The 0.1% who own all the wealth retreat into gated communities, taking some servants with them and leaving the rest to fend as well as they can without computers or the knowledge of how to find food. Outside the gated communities life becomes hellish and brutal for an indefinite period. Millions die. Inside the communities there are dictatorships, and progress stagnates as the rulers seek to maintain the status quo and their power and luxury.
At the moment we are on a path that means the programmer becomes extinct. People who can communicate and work with strong AIs while preserving their sanity will be needed, but they will be a minority, and coding will die as a skill unless it is preserved the way some calligraphers try to preserve the art of handwriting.
Most likely however the future will proceed in a way that is unforeseen at present.
Further reading
Superintelligence: Paths, Dangers, Strategies, Nick Bostrom, OUP Oxford 2016. ISBN-10: 0198739834

Thursday, 9 June 2016

Blogging With Latex

LaTeX has some advantages for writers but is not directly suited to blogging. The benefits come at the cost of a learning curve: using LaTeX for blogging requires some technical knowledge (unless a LaTeX editor is used) and a slightly more complex publishing process.
The results of an initial experiment (this post) suggest it is worth persevering with LaTeX for blogging.
The author has for years prepared content offline using OpenOffice on OS X, pasting the content into an online editor. One problem has been handling cross references, which OpenOffice displays in a fashion that is distracting and aesthetically displeasing, and which slows down proofreading and review.
LaTeX handles references more intuitively than OpenOffice (your mileage may vary) and the results can be pasted directly into an online editor.
Another advantage is that LaTeX allows the embedding of metadata, such as keywords, publication date and, where appropriate, invoicing data, in the document as comments, without affecting the output document.
These advantages made it worthwhile experimenting with LaTeX for blogging.
One disadvantage is that LaTeX is very creative in, err... creating files, which should ideally be dumped in an appropriately named directory until they are needed; fairly strong disk-clutter hygiene has to become habitual.
1 Requirements
A simple workflow is a paramount requirement.
Since LaTeX takes care of formatting, writers can concentrate on the words, and formatting was preserved when content was pasted into the online editor. It proved enough to generate a PDF file from the source file and paste the content online. This is marginally more complicated than preparing content in OpenOffice and pasting it online, but occasional problems with justification and formatting in OpenOffice are avoided this way. Where required, an OpenOffice document can be produced directly from the LaTeX source. Generally speaking an OpenOffice document will be needed only if the document has to be sent to someone, for example a publisher.
LaTeX users often handle references using BibTeX. For blog posts this is a nuisance: references should be kept within the document, thus avoiding disc clutter.
2 Solution
The solution was to use pdflatex in TeXShop to generate a PDF file. For rapid processing of files that are almost complete and require only small changes, a shell script was developed. The shell script hides the auxiliary files LaTeX produces, whereas TeXShop does not, and indeed produces more disc clutter.
Other editors such as LyX and TextWrangler proved unsatisfactory: LyX has a cluttered and counter-intuitive interface, and TextWrangler’s soft-wrap facility did not seem to work on OS X.
A LaTeX command to handle references was developed, adapting code suggested in a Stack Overflow post. At present this is stored in a known location and accessed using the \input command.
3 Conclusions
Although one study [Knauff M, Nejasmic J (2014)] has shown that producing simple text (that is, text not needing mathematical or other complex notation) in LaTeX is slower than in a WYSIWYG word processor when adjusted for experience, writing proved more pleasant because of the uncluttered interface TeXShop provides. That study does not match the experience of composing text, rather than simply transcribing it (which might be more efficiently handled using dictation software), and, subjectively, there was little impact on the time needed to produce the content.
The publishing process however proved a bit more complex. It remains to be seen whether the benefits justify this.
4 Further reading
Knauff M, Nejasmic J (2014) An Efficiency Comparison of Document Preparation Systems Used in Academic Research and Development. PLoS ONE 9(12): e115069. doi:10.1371/journal.pone.0115069
LaTeX versus WYSIWYG: Usability, security and enjoyability 
5 Command to handle references
% \sourcenamestyle formats a source name; \sourcenamerefstyle formats it as a bracketed reference
\newcommand{\sourcenamestyle}[1]{{#1}}
\newcommand{\sourcenamerefstyle}[1]{[{#1}]}
% \source starts a new paragraph and prints the source name
\newcommand{\source}[1]{\paragraph{}\sourcenamestyle{#1}}
6 Shell script to create PDF file
#!/bin/sh
# Create the log folder for this document if it does not already exist
export LOG_FOLDER="<Path to log folder>/$1"
mkdir -p "$LOG_FOLDER"
# Create a pdf file in the log folder (auxiliary files stay there too)
pdflatex -output-directory="$LOG_FOLDER" "$1.tex"
# Copy the pdf to the current working directory
cp "$LOG_FOLDER/$1.pdf" .
# Open the pdf in Preview
open -a Preview "$1.pdf"

Saturday, 4 June 2016

Capitalism, Slavery, Modern Management and Software Development

Cotton Planter and his pickers 1905
Capitalism is in denial over plantation slavery and its role in present-day practice. That role is perhaps most clearly seen in the software industry.

Slavery was a capitalist enterprise that gave birth to a number of modern management practices which arose as slave owners tried to maximise return from human capital using devices such as teams and infantilisation of slaves. It was an early example of neoliberalism with the market taking precedence over morality and religion.

Slavery's management practices persist in many areas of business, perhaps most prominently in the digital technology industry, and especially in software development. Teams have replaced field gangs, and management seeks to subvert nominally empowering methodologies like Agile to turn programmers from artisans into work-gang members: separating conception from execution, infantilising developers and turning programming into an assembly line, thus rendering software development joyless to those who recall the days before the dotcom bubble. 
These practices survive because management has technical, elite and political motivations. Technically, management is (in theory) a response to increasing organisational complexity, and it generates as well as controls that complexity. As a career it gives entry to a perceived elite and provides benefits for that managerial elite. Politically, it provides a way to enforce discipline on workers, and the role appeals to the love of power in us all.

The methods and practices of slavery have survived, but they have adopted subtler forms and become cultural norms that, like a chameleon, are invisible until spotted.

Examples of practices originating in slavery are given from today's software industry. This is not to criticise the software industry in particular: it is simply the industry with which the author is most familiar.

Modern software development methodologies such as Agile provide a way to manage complex development tasks, nominally empowering a development team while infantilising its members. Teams are the analogue of the plantation gangs, with architects perhaps taking the role of artisans and managers the role of overseers (foremen in factories).

Slavery was Capitalism and slaves were capital.

This poster shows slaves were capital assets

Slavery is the skeleton in the cupboard of capitalism and modern business culture, but like a domineering patriarch its influence lingers well past its official death. Business is keen to disavow slavery despite inheriting many of its methods. The whip has been replaced by more subtle controls such as peer pressure and the real-programmer syndrome, while slavery's practices and their underlying ideologies have, like oxygen, become invisible.

One way to exclude slavery from capitalism is to say that wage labour is a necessary defining characteristic of capitalism, that plantations did not use paid labour, and that they therefore could not be capitalist enterprises.

But if wage labour is a defining feature of capitalism then any one-person business is not capitalist, since it pays nobody wages. Similarly a farmer or landowner who rents out property (i.e. capital) would not be a capitalist.

Slaves were capital, and plantation management tried to extract maximal returns from that human capital; plantations were therefore capitalist. In the early days of slavery, when slaves could be worked to death and replaced fairly cheaply, they were closer to human resources than human capital.

Plantation Management was modern management.

Cooke [1] describes three tests for an activity to be considered modern management:
The model of a modern manager.

  1. It must be carried out in a capitalist system
  2. It must involve activities that surpass a certain (undefined) level of sophistication
  3. There must be a distinct group of people, described as managers, who carry out these activities

Since slaves were capital, used to achieve a maximal return on investment, slavery was a capitalist system.

Management of a plantation was a sophisticated affair involving a complex division of labour and techniques to control unwilling labour, namely slaves.

The manager role was carried out by overseers, to whom the owner delegated control of a plantation. There were few salaried managers but many slave overseers. Overseers constituted a managerial class that controlled, but did not own, the enterprise.

Plantation Management was Modern Management.

Modern management practices originating from Slavery

The capitalist pyramid or chain of command


Slave owners were concerned with profits and used advanced accounting techniques, such as depreciation, more consistently than many of the contemporary northern factories that are often considered the birthplace of modern management. In some ways slavery allowed a more scientific approach than the factories did, since slaves could not quit and the owner could monitor them more closely than free labour, which would simply have walked out.
Many plantations used a standard accounting system, described in Thomas Affleck’s Plantation Record and Account Books, which contained advanced techniques including how to calculate depreciation. By the 1840s planters were depreciating their slaves: appraising their inventory at market value, comparing it with past market value to assess appreciation or depreciation, calculating an allowance for interest, and using this to work out their capital costs. And all by hand. Slaves were assets whose value exceeded that of all other assets. The owners could have said “people are our greatest assets”, but they did not regard slaves as people.
Slaveholders developed a unit of value called “the prime field hand”; today’s equivalent is the man-day or, for machines, horsepower. They defined the prime hand by criteria such as expected production per day. Workers were measured against this standard and given values such as “half hand” and “quarter hand”, and the standard was understood across plantations: if a slaveholder said he had 13 hands who were the equivalent of 10 prime hands, other slaveholders knew exactly what that meant.
A plantation was a complex enterprise with a complex division of labour. Thus [1] a Virginia estate of six plantations, one visitor noted, had six overseers (managers), a general agent (director) and “staff” employees covering a traditional managerial trinity: financial resources (a book-keeper), literal human resources (two physicians and a preacher) and plant (a head carpenter, a tinner and a ditcher). Today the plant would be an architecture group and IT support. The visitor added, “Every thing moves on systematically, and with the discipline of a regular trained army”. This is management's wet dream.
Managerial techniques included the application of scientific method, selecting the best person for the job, monitoring performance, sophisticated organisational rules, a chain of command, team spirit, analysis of the maximum number of people an overseer could properly control, attempts to instil discipline, and the separation of conception from execution.

The Plantation as a Rational Machine

The inhumanity of plantations, factories and indeed the modern office arises from a view of the enterprise as a machine, or at best a factory, rather than a community. This view may not have originated with slavery but it was definitely present at that time. Cooke [1] confirms this with a quotation from Bennet Barrow’s Highland plantation rules: “A plantation might be considered as a piece of machinery. To operate successfully all its parts should be uniform and exact, and its impelling force regular and steady.”

This sentiment is echoed in modern software development: all members of a team, especially an agile team, should be interchangeable, and velocity must be maintained and increased, generally at the expense of creativity and innovation and, in a depressingly large number of companies, with no concern for the workers.

Industrial Discipline

Management is a bit more subtle these days.
The plantation-factory-machine required military-level discipline from its biological components. The pace had, as with software development, to be fast and predictable, and was never enough for the owners.

Slaves were obviously not focussed on surprising and delighting the customer, and would resist efforts to make them work harder. As in the Stanford Prison Experiment, force would work but would damage profits if kept up indefinitely, so other tricks had to be used. Most of these can be identified in the later practices of Scientific Management, or Taylorism.
One tactic used was the gang system, which split slaves into teams of 10 to 20 people (the modern consensus is that about 9 people is the maximum size of an efficient Agile team). Every member had a defined task and depended on the actions of other members, and there could be sub-gangs for different tasks. Work was organised to foster tensions and dependencies between gangs. The gangs also developed a team spirit, in which effort and commitment to the team (teamwork) was manipulated for the slave owners' ends.
Thus in a field gang, when planting, the fastest workers would run ahead and dig holes, the slowest would follow behind and drop seed into the holes, and an in-between group would come after them and cover the seeds.
The modern software development team may be considered the equivalent of the gang, with the line manager taking the role of overseer and the team largely policing itself. Team members are treated as commodities, with the notion that members should be interchangeable.

Separating Conception from Execution
Agile: don't think, just move faster

The modern enterprise has senior management who, like plantation owners and general agents, are tasked with deciding what needs to be done and with originality and innovation; line managers tasked with delivery; lead developers tasked with planning and execution; and front-line developers who, like field hands, are tasked with delivery.
Separating conception from execution is part of Taylorist Scientific Management. Since elites have long realised that letting underlings think is dangerous (one of the driving forces behind the decades-long ruination of education in the UK and the increased focus on training), underlings have to be trained not to think; indeed they must be infantilised. The baby language of Agile user stories serves this function admirably.
Separation of conception from execution, with managers doing all the thinking, in increasing detail as orders proceed down the chain of command, requires the worker not to think. At the developer level all that is needed is the ability to code to order and write tests. The manager and the leads run ahead determining everything, the junior members fill in the code to order, and the testing team run behind ensuring all is well. Then the code is loaded onto the production truck and delivered to the customer.


Slaves waiting for sale: any resemblance to a Job centre is purely coincidental

The modern manager who insists workers must be interchangeable (and therefore faceless)
Modern capitalism started largely with slavery and gave birth to management practices that persist today. The whip has been replaced with the PIP, and teams have replaced the gangs. Teamwork and team players are the modern mantra. Software development methodologies such as Agile, which were intended to empower developers, have been used to reduce teams to field gangs, with team spirit and peer pressure making the job of the overseer/foreman, now known as the manager, easier, and exploiting team members' care for each other to business ends.

Reducing skilled workers to digital field hands, a long-time goal of management, seems to underlie much of the dysfunctionality of modern business. Like slavery, these practices are very profitable, except to the workers, but they come at a considerable human and social cost.

Research is needed to ensure that business can be profitable at minimal human cost, and to ensure that management and business tools do not separate us from our humanity, as happened in the Holocaust, when management techniques allowed camp commandants and others to forget they were dealing with humans.

Of course this may all become redundant if AI eliminates the need for work, and with it the workers earning the money to pay for the goods the AI produces. 



  1. The Denial of Slavery in Management Studies, Bill Cooke, University of Manchester, IDPM Discussion Paper Series, Paper No. 68, July 2002
  2. Plantations practiced modern management. Note that this research focussed on American slavery and factories; there is good research to be done looking at the influence of British practices.
  3. A Renegade History of the United States, Thaddeus Russell, Simon & Schuster 2011, ISBN 9781416576136
  4. The eerie similarities of slavery management practices to modern business
  5. The Stanford Prison Experiment, by Zimbardo
  6. Capitalism and Slavery, Eric Williams, University of North Carolina Press 1944
  7. Slave owners versus modern management: can you tell the difference?

Wednesday, 18 May 2016

The Software Developer Shortage Myth: a confidence trick repeated

The current developer shortage myth is merely a repeat of the 1960s myth of a shortage of technologists in the UK, when the brightest minds, not being stupid, realised the best life was to be found outside the UK. There is evidence that the shortage myth is being manipulated to justify the outsourcing of software production.

The Brain Drain and the White Heat of technology

In the 1960s the media panicked about the brain drain: British scientists and engineers emigrating to places where the view that scientists should make everything except money did not hold sway. At the same time the great and good (lawyers, accountants, politicians etc.) claimed Britain needed more scientists and engineers and referred to the transformative effects of the “White Heat of Technology”. Persuading young people to enter a career in STEM (Science, Technology, Engineering and Mathematics) was a great way to prevent bright minds outside establishment families from aspiring to the positions of power and influence held by those calling for more people to enter STEM. Whether or not the country needed more STEM graduates, it seemed clear that industry and commerce wanted STEM the way they wanted the proverbial cranial aperture.

Fast forward 50 years and there is an alleged shortage of software developers. When evidence to the contrary is presented it is dismissed, or spun as a shortage of “good” developers, “good” as in “what business says it wants”.

In 1948 London Transport imported West Indians to do the jobs British people were allegedly unwilling to do at the wages on offer. This eventually led to race riots in 1958, which were, rightly, suppressed. In the late 20th and early 21st centuries the Internet allowed the outsourcing of many types of work to developing countries. Since this did not bring “immigrants” into Britain to “steal jobs” (always an incorrect term, since jobs are offered, not stolen), it made a repeat of the job-related race riots of 1919 and 1958 almost impossible.

Today anecdotal evidence is being spun into a mythical shortage of software developers to justify outsourcing development to cheaper countries.

Even if there is a shortage of developers, the current (2016) drive to increase the number of people who can program will, by virtue of the law of supply and demand, polarise the software industry into a small number of near genius level digital technologists, a large number of commodity developers doing routine work and a few “seniors” and “leads” in between herding those at the bottom.

As in the 1960s, the public are being deceived into trying to enter STEM, today with a focus on digital technology, in order to benefit the “elite” of society.

There is no developer shortage

At the time of writing, layoffs in Silicon Valley, supposedly the place where the demand for software engineers is greatest, have doubled. Even allowing that the supply there exceeds the local demand, an excess of software talent in the supposed hub of innovation is surprising.

Also at the time of writing there has been a fall in the number of permanent technology opportunities in the UK but a corresponding rise in contract opportunities and, according to Cvtrumpet, competition for job vacancies is increasing as the number of employees who are looking for a new job has reached its highest level since autumn 2013.

This may reflect uncertainty over Britain's membership of the EU plus more generalised anxieties.

Developer salaries have not risen in real terms nor, it seems, in absolute terms, at least in the UK. London has the highest salaries and the employers least likely to compromise on their requirements, but London salaries reflect the cost of living there and are not qualitatively different from those elsewhere. Ten years ago an entry-level developer could expect to command about £30,000 a year. Today entry-level developers can command between £25,000 and £32,000 a year. This is higher than most non-programming jobs, but advancing beyond this can be difficult without job hopping, and requirements are more stringent: an increasing number of entry-level jobs require a degree in computer science, and in practice this will mean an upper second or a first.

The most telling evidence against a developer shortage is that every advertisement for a developer job attracts several hundred responses. This level of response has led to automated CV scanning and rejection by software that simply ticks boxes. This saves the time of people in HR who used to tick boxes manually, but can be overkill: some years ago a company received 25,000 applications for a run-of-the-mill job and their scanning software rejected every one. Companies may have an exaggerated view of what such software can do, but the fact that automation is needed to scan applicants strongly suggests an abundance of candidates. Against that is the increasing ease of making an application, though this trend is not confined to programming jobs.
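The box-ticking style of CV screening can be caricatured in a few lines of code. The sketch below is purely illustrative, not any real product: the class name, keyword list and matching rule are all invented, and real systems are presumably more elaborate, but the underlying logic is the same: reject any CV missing a required keyword.

```java
import java.util.Arrays;
import java.util.List;
import java.util.Locale;

public class CvScreen {

    // Hypothetical required keywords; a real screen would take these from the job spec.
    static final List<String> REQUIRED = Arrays.asList("java", "sql", "agile");

    // Tick boxes: pass only if every required keyword appears somewhere in the CV text.
    static boolean passes(String cvText) {
        String lower = cvText.toLowerCase(Locale.ROOT);
        return REQUIRED.stream().allMatch(lower::contains);
    }

    public static void main(String[] args) {
        System.out.println(passes("Java and SQL developer, agile teams")); // true
        System.out.println(passes("Brilliant engineer, 20 years of C++")); // false
    }
}
```

Note how the second, possibly brilliant, candidate is rejected outright: crude substring matching has no notion of transferable skill, which is how 25,000 applications can all be rejected.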

But could it be that there is a shortage of “Good” developers?

There is no shortage of “Good” developers: Good is whatever business wants it to mean

A standard response to the question “is there a shortage of developers” is “NO, there is a shortage of good developers”.

The problem here is defining a “Good” developer.

At least one employer has specified that candidates must have graduated from one of the top ten universities in the world. This is obviously someone's attempt to boost their own ego and must be mentioned, then ignored.

There are various definitions of “good”, but they all seem to relate to what business says it wants. Ignoring experience and technical skills, this seems to boil down to a love of problem solving, an analytic mindset, a passion for learning, a desire to improve and the ability to work in a team.

Solving problems is easier than deciding what problems to solve: this meta-problem is normally reserved for managers. An analytic mindset tends to conflict with creativity, another function often reserved for managers. A passion for learning is often reduced to a “passion for technology”, with technology meaning programming, and a desire to improve is allowed as long as it does not threaten the power structures in the company. The ability to work in a team is usually good but tends to reduce the best to the average, and managers tend to distrust anyone who is willing to work on a cross-team basis.

Once upon a time programmers rejected corporate values and produced remarkable things. Today programmers embrace corporate values and tweak existing products. Innovative products today are produced by those who have never been in a corporate environment or have left it in order to do their own thing. Unfortunately when people like this succeed they usually form a corporation and create a corporate culture just like everywhere else.

Business defines good developers as what it says it needs, a definition that could well exclude many pioneers of software development, and the developer community has embraced this definition. What business NEEDS may be neither what it says nor what it thinks it needs, nor what it actually desires (compliant, machine-like employees who do not rock the boat but meet deadlines).

Why does business claim there is a shortage of developers?

The developer shortage myth allows manufacturers to lobby for outsourcing

It takes years to learn digital skills.
The most cynical answer, and therefore the one most likely to be correct, is that employers want an excuse to outsource work to developing countries and thereby reduce wage costs.

Fortune Magazine [1] looked at an analysis of the US H1-B visa system and found that for only 30% of the top ten positions requested on H1-B visa applications were there too few qualified American jobseekers to meet demand. It was also common for employers to write requirements so narrowly that only one temporary overseas worker would fit. The pattern for this was set by a Jobserve advertisement early in the 2000s which required a software developer who was a “graduate of an Indian university, fluent in an Indian native language and with a complete understanding of Indian culture”. The Fortune article does imply that employers are abusing the H1-B system.

In the UK in the early 2000s outsourcing proved less popular than importing workers from developing countries and exploiting them mercilessly. Unlike in 1919 this did not provoke race riots. Some contractors on the shout99 website noted that they had to spend half their contract training inexperienced overseas developers to replace them. One contractor reported workers employed at £35 a day and replaced regularly so as not to violate restrictions on time spent in the UK. At least one other company in mainland Europe (which I will not name) did the same, transferring workers from its overseas branches to a branch in Europe.

In brief the shortage myth was created to lower costs by importing lower paid workers keen to work in the US or by outsourcing development.


Current trends suggest there is no shortage of developers. Layoffs in Silicon Valley are increasing, software salaries are not increasing, employer requirements are becoming more stringent and the response to an advertisement for developers is so great that automated CV scanning is needed.

Long experience suggests the “shortage” is a myth designed to justify cost reduction by outsourcing and that the drive to increase the number of coders is a confidence trick that will ultimately result in shunning of STEM by the next generation.

If there ever was a shortage of developers it no longer exists.

The most likely resolution of the developer “shortage” will be the development of specialised AI that can produce code from a rough spoken or typed specification, perhaps learning from its mistakes. At that point the entire software industry will be dead, or at least meat free.


Saturday, 14 May 2016

Fad and Dogma Driven development

It's all over the internet so it must be true – unless it's in Wikipedia

Former Conservative MP Matthew Parris once noted that at his university science students dressed conventionally and tended to accept authority uncritically, at least outside their specialist area. Unlike arts and humanities students they were not exposed to critical thinking or linguistic analysis, nor trained to question authority. Outside science this made them easy prey for politicians and other manipulative characters.

Programmers were always dogmatic and, like scientists, tended to be socially conservative and to pontificate, often ignorantly, outside their field. They also tended to be a little unconventional, and sometimes creative in areas like writing, music and art. Nowadays, however, programmers, at least the younger ones beloved of big corporations, tend to be corporation people and equate internet popularity with goodness. The need to constantly update skills and keep up with the constant flow of new frameworks, libraries and technologies means that the more articulate and persuasive developers, or those with the loudest voices and a desire to put “cool” technologies on their CVs, get their pet technologies adopted, usually without proper evaluation by architects.

The result tends to be an industry driven by fads fossilised in legacy architectures, infrastructures and code bases: Struts, EJB, TDD, BDD, MDD and others. Adopting technology without proper evaluation is suboptimal and leads to technical debt down the road. On the other hand, fad driven development does stoke the demand for developers.

Kuhnian and Popperian software development

Philosophers of science tend to be either Popperians, who emphasise how theories can be disproved by a single observation, or Kuhnians, who argue that scientific communities develop paradigms - explanatory conceptual frameworks - and dismiss observations contradicting the dominant paradigm, tweaking the paradigm till it becomes untenable and a simpler paradigm arises. For example, the planets were once supposed to orbit Earth in perfect circles. Observations disproved this, so circles around the circular orbits (epicycles) were added. Eventually a new paradigm arose: the planets, including the Earth, orbit the Sun in elliptical orbits.

At a high level, and ignoring subtle nuances, Popperians seem to describe how science SHOULD be done and Kuhnians how science IS done. Kuhnians seem closer to an anthropological and sociological description of science, and to real life. In a nutshell I view Popperians as prescriptive (a dictionary tells you how a word must be used) and Kuhnians as descriptive (a dictionary tells you how a word was used when the dictionary was compiled).

Currently accepted silver bullets for software development include Agile, Test Driven Development, Feature Driven Development and Model Driven Development. More cynical notions such as Dogma Driven Development and Politics-Oriented Software Development tend to describe the reality of software development more closely.

In technology, paradigms are successful fads. Once a fad, whether a management one or an engineering one, gets enough mind share, it percolates through the company, and possibly the industry, regardless of merit and lasts till the next fad appears.

How are technologies adopted?

A process or technology is adopted by a company when proposed by an influencer whose interests may not be those of the company. They may want the project to fail in order to move to another project, or to be seen as the near saviour of a doomed project. Or they may want a new skill that will help them find a new job. Perhaps they may even want to make their life easier.

If the technology is popular on the internet it must be good, just as eating dung is good because 40 million flies can't be wrong or a brand of soap must be good because a top footballer is endorsing it.

Of course the proposer does not phrase it that way; instead they plug short term benefits to the company, hoping to be out of the door before the problems arise, or simply not seeing future problems. This bears out the observation that the stupid man is considered brave because he does not see dangers and decisive because he does not see alternatives. As on the business side, things tend to be driven by those with the loudest voices, strongest opinions and strongest bladders. If they are also respected developers they have an easier task. The late C. Northcote Parkinson's descriptions of how committee chairpersons drive their own agendas, overriding objections in multiple ways, are still relevant here, even though the context is a scrum meeting rather than a boardroom.

Since most managers prefer to see confidence and a single decision rather than alternatives, technology adoption tends to be driven by smart fools, though Security or Architecture may block some bad choices.

Popperian and Kuhnian fads

Test Driven Development, Behaviour Driven Development and Model Driven Development are loved by managers since they prescribe behaviour and are in essence Popperian – they say how something should be done. However there seems to be no non-anecdotal evidence that they actually speed up development or result in more robust, maintainable, future-proofed code.

The Kuhnian processes, for example Dogma Driven Development and Politics-Oriented Development, to which we can add Fad Driven Development, are descriptive, describing how things ARE done. Being driven by human nature they will outlast any prescriptive fad, but they need to be researched by sociologists and anthropologists rather than engineers. Agile sits awkwardly in this classification: it is basically Popperian, but it can be, and often is, implemented in a way that is susceptible to being hijacked by Dogma Driven Development and Politics-Oriented Development, or turned into a management tool to kill that inconvenient spark of creativity and innovation which occasionally arises.

The impact of Fads

Fads are adopted in the technology industry by developers who are rightly afraid that not adopting a fad that is likely to become fashionable will leave them trapped, dependent on their current employer – who is likely to be happy to miss technical fads, thus gaining more power over their teams, as long as their own pet management fads are implemented.

Technical fads and management fads do not always coincide and rarely cause terminal problems for the enterprise: if they do cause terminal problems, management pays the price only if the enterprise goes under before they can bail out.

Fads are not peculiar to the software industry: earlier decades had Pokemon, pet rocks, the video baby and software programs (and I cannot even recall the name) that acted like pets and required constant attention or they would “die”. Politics has neoliberalism, and religion adopted pop music for a while.

The wrap

Software developers tend to be socially conservative, more likely to accept authority and possibly more susceptible to peer pressure, even if the authority is only the repetition of unproven and untested theories on the internet. Having chosen a fad they then defend it with religious fervour.

Managers must ensure that technologies are not adopted because of temporary fads and that any proposed new technology is properly evaluated. Managers should bear in mind that programmers are dogmatic, socially conservative, equate goodness with popularity and assess the truth of statements by frequency of appearance in search engines, and that network effects, citations and a lack of independent verification mean that the Wisdom of Crowds is not a good criterion to use when choosing a technology.

  1. Politics-Oriented Software Development: TheophileEscargot (originally at a URL that is no longer available)

Saturday, 16 April 2016

Why administer coding tests?

Coding tests are becoming more and more common as a way of filtering out applicants. However it seems to me that employers are not thinking about what they want to learn from the candidate's approach to the test: increasingly, sites like Codility are being used as initial screens. Using such a test the interviewer cannot see the candidate's thought process or their coding style. What is more, such tests, as they have a time limit, create almost as much stress as coding on a whiteboard in a face to face interview.

All I need is the right coding test and
The right candidate will pop out.
In my view a coding test should be used to answer at least the following questions:

  1. Is the candidate's understanding of English (say) good enough to let them do the job?
  2. Does their coding style fit the company's needs and/or the culture of their prospective team?
  3. Are they a “get it done” person or a “get it right” person?
  4. If they are unable to complete the test, would they be able to perform in the workplace if mentored?
  5. If they do not complete the test, does this indicate an inability to handle stress? If so, does it matter?

And this cannot be achieved with remote timed tests administered by a third party.

One simple test is to ask them to write FizzBuzz: for integers between (say) 1 and 100 inclusive, the applicant should produce code that will print “Fizz” if the number is a multiple of three, “Buzz” if it is a multiple of five, “FizzBuzz” if it is a multiple of both, and the number itself otherwise.

As an exercise I programmed FizzBuzz using a Java 8 stream in order to further internalise the Java 8 functional paradigm. I started with a slightly different algorithm to the obvious one that first came to mind and, in the spirit of a timed test, reverted to the simpler algorithm at the first sign of difficulty. This is another risk of a timed test: the candidate will run with the first solution they think of, and this may be neither optimal nor creative. There is a universal tendency to think of a superior solution two minutes after the test ends.
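For the curious, a minimal Java 8 stream version of the simple, obvious algorithm might look like the sketch below. This is my own illustration, not the code from the exercise described above; the class and method names are invented.

```java
import java.util.stream.Collectors;
import java.util.stream.IntStream;

public class FizzBuzz {

    // Map a single integer to its FizzBuzz word, or to the number itself.
    static String word(int n) {
        if (n % 15 == 0) return "FizzBuzz"; // multiple of both 3 and 5
        if (n % 3 == 0)  return "Fizz";
        if (n % 5 == 0)  return "Buzz";
        return Integer.toString(n);
    }

    public static void main(String[] args) {
        // Print the words for 1..100, one per line, using a stream pipeline.
        System.out.println(
            IntStream.rangeClosed(1, 100)
                     .mapToObj(FizzBuzz::word)
                     .collect(Collectors.joining(System.lineSeparator())));
    }
}
```

Trivial as it is, even this exposes style choices an automated screen cannot see: testing for 15 before 3 and 5, extracting the mapping into a pure function, and choosing a stream over a loop.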

Another problem with tests is that a candidate who performs well may be a details person to the extent that they cannot see the wood for the trees and, once in post, will miss design problems in your codebase.

David Saintloth [1] notes that FizzBuzz-type tests test knowledge of the minutiae of a given programming language, and that syntax errors in coding are often punished harshly even if it is clear the candidate understands the algorithm they wish to use. Further he says

fizz/buzz tests ONLY test programming (and in that not very well either). What people want to hire though are engineers not programmers....but they test for programmers, job requests are programmer focused (listing a flurry of buzzwords) which leaves potentially brilliant engineers out in the cold.

Ericsson [2] notes that aptitude tests, and coding tests are a form of aptitude test, predict short term performance but not long term or even medium term performance. Coding tests pander to the tendency for developers to focus on tiny details rather than the big picture. As an example, I noted in a post on code reviews [3] that a comment in [4] on how code reviews focussed on a minor and irrelevant aspect of the code was followed by a large number of comments discussing the code presented, totally hijacking the discussion: this is a bit like judging a painter by a single brush stroke rather than the picture as a whole, and by how they hold the brush rather than what they do with it.

Some people say that rather than coding tests, candidates should be asked to show code samples or discuss projects they have undertaken. These solutions have problems. Code produced for an employer or client is often covered by a non-disclosure agreement, and some developers deliberately refrain from spare time coding for a number of reasons: for example to avoid burnout or domestic friction, or to study other things, architecture for example, or something totally unrelated to technology. The argument that such programmers are not “passionate” about technology is specious: it involves the “Real Programmer” syndrome and the assumption that a developer is, or should become, obsessed with technology to the exclusion of all else. This of course benefits managers trying to squeeze more out of their resources before throwing them away when they burn out [5].

In Brief

If coding tests are to be used, the employer must decide what they want to learn from them rather than just applying them as a silver bullet to fix the probably unsolvable problem of picking, if not the best candidate, then one capable of doing the job. It is possible that in the future AI techniques will be better able than humans to pick the “right” candidate, but such an AI could, and probably would, pick another AI to do the job.

More seriously, filtering candidates on the basis of automated or semi-automated screening not only risks leaving out potentially brilliant candidates but ignores the heuristics that cultural fit is more important than technical ability, that technical weaknesses can be trained out, and that a hire who turns out unable to do the job for which they were hired may do brilliantly in another role, something that may only be discovered after they are hired.

There is a role for coding tests but using them to screen applicants automatically suggests that the fabled shortage of developers is indeed a fable: If there were a real shortage employers would be willing to take on and train candidates they now reject.

Further reading

  1. Ericsson, Krampe and Tesch-Römer: The Role of Deliberate Practice in the Acquisition of Expert Performance: Psychological Review, 1993, Vol. 100, No. 3, 363-406
  2. Ways to make code review not suck
  3. Karojisatsu: Mental health and suicide risks posed by IT culture for Managers and Developers