
Why a Great Programmer is Worth Fifty Good Ones

Great programmers are orders of magnitude more productive than merely good programmers.  It’s like basketball players.  Michael Jordan isn’t just 20% or 50% or 100% better and more useful to his team than you or I would be; he’s simply in another league (or was).

The best thing you can do to make your technology-based company succeed is hire (or outsource to) a great programmer.  The second-best thing (Mary might disagree) is to get a great marketing person, but that’s another topic for another day.  Today’s post is about the person who will build the better mousetrap for you.

The question came up when I was consulting for a development manager at a tech company.  He told me that the CEO, his boss, wouldn’t give him the salary and option freedom he needed to close a great programmer he’d found.  The salary would have been 20% above what he had approval to offer; and, thanks to the new accounting standards for stock options, he didn’t have the authority to offer options.  He lost the potential new hire and had to settle for someone merely “good”.  The ironic thing is that he had several open positions, so, once he gets through hiring several people, he’ll end up paying more in the aggregate than he would have paid for the superstar – and probably won’t get as much productivity.

I suggested that he make a deal with his boss:  put a cap on total budget but not on individual salary.  Let him, the manager, decide whether to put his budget into one or more superstars or a gaggle of good players.  Alternatively, promise to produce more with the same budget if only he can have freedom from salary caps.  He promised to take my advice, but it’s too soon to know if it’ll work out.

But why is a great programmer worth fifty good ones?  It all has to do with the interfaces.

If one person can do a whole project, a whole layer of complexity, documented interfaces, and misunderstandings is eliminated compared to having two or more people working on the project.  Meetings don’t have to happen; schedules don’t have to be synched; joint testing and finger-pointing over faults don’t happen.  No personality conflicts, either.  The actual amount of work required to do the project is much less when it is being done by one person.  Management time is reduced, too.
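The interface overhead grows quadratically with team size – Brooks’s classic observation in The Mythical Man-Month.  A minimal sketch of the arithmetic (the team sizes are purely illustrative):

```python
# Pairwise communication channels in a team of n programmers: n*(n-1)/2.
# These channels are the "interfaces" -- the meetings, schedule-synching,
# and finger-pointing the post describes.
def channels(n):
    return n * (n - 1) // 2

print(channels(1))   # 0: a lone superstar has no interfaces at all
print(channels(5))   # 10: four programmers plus a manager
```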

So, suppose your typical project (I know there’s no such thing) needs to get done in a month.  If you have programmers who, working alone, could do this project in two months and the deadline is real, my guess is that you’ll have to assign four of them plus a manager to get done in a month.  Such is the inefficiency of multiperson projects.

But, if you have a superstar who can do the whole thing in a month, you only have to assign the superstar.  You also get better odds of actually finishing on time AND get better, more maintainable code.  Now how much more is the superstar worth than the merely “good” programmer?

But there is more to the productivity gap than just avoiding multiperson projects, as important as that avoidance is.  A great programmer, working alone, is usually orders of magnitude faster than a good programmer also working alone.

If you’ve ever had a superstar working for you, you’re probably used to leaving him or her behind in the office when you go home – if you’re the CEO of a startup you don’t go home that early – and finding him or her in the office when you arrive.  If it’s a him, the stubble’s a giveaway; women cover up a little better.  A superstar prefers to work in at least twenty-four if not forty-eight or even seventy-two hour spurts.

Again, it’s the interfaces: it’s very expensive to put something down, go home, sleep, wake up, and then recover all the loose ends. The first half of writing a program creates almost all the loose ends; the second half is tying them back up.  Like surgery. If a programmer has the skill and physical stamina for marathon gulps of creativity, then there are many fewer times when the non-productive work of putting stuff away neatly (mentally) and picking it up again has to happen.  For similar reasons, one-person debugging is best done when you can crawl inside the code and stay there until you’re done.

Now, unfortunately, this kind of endurance is a young person’s game.  Michael Jordan isn’t in the NBA anymore.  I can’t stay up seventy-two hours anymore, not even forty-eight.  And I can’t program nearly as well as I used to.

Older programmers can be craftier, though, just like older athletes.  Remember, productivity isn’t measured in lines of code, far from it.  Too many lines of code is the hallmark of an unskilled journeyman.  Productivity is measured in terms of how little time it takes to create how many great things that actually work.  So the third reason that superstar programmers are superstars – after the ability to do a huge project alone and at a single bound – is a canny knack for the smallest solution.

I posted about super debuggers here and posted a phrase-book of programmer-speak. How programmers can manage their non-technical CEOs is here; managing technical CEOs is here.  The character Dom Montain in my novel hackoff.com: an historic murder mystery set in the Internet bubble and rubble (hackoff.com is an anti-hacker software company) is a super hacker, which is, of course, a sub-breed of super programmer.  He won the freshman robotics contest at Caltech before dropping out to write video games.  You can read or listen to the novel free at www.hackoff.com.

Here’s a puzzle for superstar programmer types:

What’s the general algorithm for the longest-running, non-infinite program on any digital computer?  Obviously, no I/O of any kind is allowed, nor may a clock be queried. You can assume that the computer will continue to function for as long as necessary, and you can ignore quantum uncertainty (I know that was worrying you). Hopefully, someone’ll answer in a comment but, if not, I’ll post the answer some day.


Listed below are links to weblogs that reference Why a Great Programmer is Worth Fifty Good Ones:

» Trying to Hire Michael Jordan from THE GOGO OHS
Tom Evslin has a great post up on hiring "Michael Jordan" programmers: Fractals of Change: Why a Great Programmer is Worth Fifty Good Ones... [Read More]

» Working with the best from It looks obvious
Many years ago, when I was a young officer, a commander of mine taught me one of the most important lessons. He explained to me that you can determine officer quality by testing the quality of the officers he chooses to work with. Good officers always... [Read More]

» Superstars from SortiPreneur
Tom Evslin has a post about why he thinks one great programmer is worth 50 good ones. He primarily defends his point with the lack of interfaces creating greater productivity: If one person can do a whole project, there is a [Read More]


» Great Programmers Or Great People? from Musings of a West Texas Business Venture
Tom Evslin has some great points to make about great programmers vs. good programmers, and in my experience, he is spot on. However, he restricts his discussion to programmers, and the reality is that his description of "good" vs. "great" applies to ... [Read More]

» Items of Interest: 2006.05.08 from Ellis Web
Things that I found interesting on May 8, 2006: Web Hostings Dirty Laundry - Record of conversation between Dreamhost (affiliate link) and hosting-review.com, a supposedly unbiased review site. Here Dave from said review site spen... [Read More]

» Building Great Products from techrageo.us
Product development cycles are riddled with problems. Sales people promise a great new, impossible Whatsit to customers then come tell you the Whatsit has to be built. Managers give engineers a delivery date and then ask engineers to size the effort. ... [Read More]

» Volume Consultancies vs. Quality Consultants II from E-Valuation of Information Systems
In response to my posting from yesterday James Sherrett has given a good description of how a Quality Consultant can draw on a network of specialists to offer services tailored to the special needs of a customer while a Volume Consultant primaril... [Read More]

» Whom to hire for a startup? from Roman's miles
You're building a new startup and hiring a team. It's quite clear you need experienced people for key positions down to team leaders but what to do with simple programmers? What demands to their experience will you make? Will you [Read More]


Comments

Michael Hamilton

Tom:

Another thought: I find myself requiring approximately 15–30 minutes to fully re-immerse myself in a typical problem domain if the code is written well. If not, it might be an hour or longer, spread throughout the day as you need to reference new pieces. The amount of time it takes to re-immerse yourself is, I believe, inversely proportional to the cleanliness of your code.

Let me recommend looking into the book Clean Code by Robert C. Martin. It would make for an excellent gift to any programmers who are "good" but would like to become "great". The tidiness of a problem will save REAMS of development time and is in itself a great springboard into faster turnaround and better results with more maintainable code.

I would consider myself an advanced programmer and I found the book very difficult to read, not because the subject was difficult, but because it forced me to re-evaluate the way I write code. Given to a newer programmer I imagine it wouldn't be as difficult for them to get into, but they might not see -why- things are suggested because a lot of the concepts require experience to really digest.

That said, it would definitely help steer them in the right direction, give it a look.

Michael Hamilton

Tom:

I understand extraordinary circumstances call for extraordinary effort. I guess what I was getting at is that ridiculous hours shouldn't have much to do with what separates the great from the merely good. I will admit that I've spent 24- and sometimes even 32-hour stretches, though rarely longer than that, on last-minute school projects that I had been working on regularly up until that point.

Presently I'm the sole developer responsible for (and indeed, the creator of) the CMS software my company uses in many client sites. It's a small company with two other programmers who rely on my code daily, so it's a bit easier to manage than some multi-dozen-person shop. I know that some days require longer hours; you can't leave for the day having broken the pipeline. Personally, I rarely put in more than 12 hours in a sitting. 72 would be counter-productive without at least having naps to rejuvenate you... But as I said earlier, best to avoid that type of situation when possible.

When I was at a very large game studio for a work term during university, I got a great glimpse at the games industry, and what I saw was that crunch times like the one you described are being addressed as the industry matures. Right now managing programmers is such a new type of requirement that many companies just don't have it right yet, and so you see deadline creep, crunch times, and stress.

Things like that are getting better year after year and I suppose it depends on where you work and what kind of clients you cater to, but in general good scheduling should avoid the marathons you've described.

In any case, I found myself nodding along to the rest of your article. I suppose it's old now, but it is still pretty relevant. I was just searching for "Programmer Worth" on Google to see if I could find something on the topic your blog discusses.

Tom Evslin

Michael:

I worked 72 hours on rare occasions - not because I had only 72 hours until a deadline but because I was at a stage in a problem where the overhead of putting it down and picking it up was greater than the pain of staying awake (I was young, of course, or I couldn't have done it).

Eating shut down after about 36 hours. It took more than a day to recover - but less than a week.

But I think the same task might have taken more than three elapsed weeks if I had tried to break it into eight-hour segments.

Michael Hamilton

No sane person works more than 24 hours straight. There is a very quick drop-off in productivity, ability to concentrate, and general attentiveness with extended sleep deprivation, as has been demonstrated in several medical studies.

If, as a manager, you see a programmer has spent 24 hours at his desk, you had better damn well be telling him to go home and get some sleep. Product cycles usually involve a "crunch" time, but this is neither desired nor required. If you are experiencing a significant deadline "crunch", it is usually due to a lack of proper design and management and should not be considered "normal".

Putting the "pro" into "programming" is what separates that top 5% from everyone else, and those who fit in that category should be greatly valued. Part of valuing your employees is making sure work and personal life are in balance. Sleep is a part of that. You're looking at a future burnout if they are running 72-hour shifts, and I am not talking years, I'm saying a few months.

Erica

OK, I would like to comment on this as someone who has considered herself a great programmer. This sounds arrogant, I know, but for the particular gig I am thinking of I was the top paid developer, I created two systems basically from scratch (this place didn't believe in requirements) and when I left they hired several people immediately to replace me.

I worked long hours but not 72 at a stretch. I think throwing that in confuses the issue because any boss would like to hire someone who works 3x the hours of anyone else. The real difference lies elsewhere as I discovered.

I left to start my own company, and like the post I of course wanted a superstar too -- I didn't want to code everything myself. What I ultimately discovered, and this was a painful process, is that management plays an important role here. If you have the right systems in place for designing and communicating interfaces and requirements, a lot of merely good programmers can become excellent ones. It is not easy to do, but it will allow your project to scale, to hire from a pool of people who want to see their families occasionally, and save you when the superstar quits in two years to do something more interesting.

You do need a great architect, as an earlier poster mentioned, and here I confess I've basically had to take over. But going through the other steps has made our development team stronger and allowed us to tackle larger problems. While I generally recommend hiring the smartest people you can get, if you are dependent on a superstar I would take a hard look at your process. Great managers are important too.

Zeno Davatz

This is a great post and I totally agree with what you are saying about how 1 programmer can be as good as fifty. But one thing does not seem right to me:

> A superstar prefers to work in at least twenty-four if not forty-eight or even seventy-two hour spurts.

The best athletes, musicians, and entrepreneurs are those that have a gentle and regular rhythm. Sniper-style no-sleep will do lots of damage in the long run.

Also, for me: a great programmer gets greater as he grows older.

timb

"Hopefully, someone’ll answer in a comment but, if not, I’ll post the answer some day." ... well played.

Joseph Bruno

1. The computer is deterministic.

2. So its next state depends on its current state and on nothing else.

3. And a given current state can only ever turn into one "next" state.

[you can't disprove this with an "up and down" counter: when it is counting up the state is {n,UP} and when it is counting down the state is {n,DOWN}, which are two different states - so the total is N+1 bits of state, N in the counter and 1 in the "UP/DOWN" setting].

4. So the longest program is the one that goes through the most states, once each.

5. N bits of memory can have 2**N possible states.

6. So the longest program cannot have more than 2**N possible states.

7. A counter can have 2**N possible states.

8. So no program can take longer than the simple N-bit counter.

Each byte of program code subtracts 8 bits from the space available for the counter. So each byte of program reduces the running time by a factor of 256, which makes programmatic cleverness very unlikely to be useful.

Using the registers would be useful if you can do this in less than 4 bytes of code (less than 8 bytes, for a 64-bit computer). For instance,
XXX: INC AX
JNZ XXX
takes only 3 bytes but lengthens the effective "counter" by 4 bytes.
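The byte-accounting above can be sketched as a toy calculation (the memory sizes below are made up purely for illustration):

```python
# Toy model of the code/counter tradeoff: if a machine has `total_bytes` of
# memory and the program occupies `code_bytes`, the counter gets the remaining
# bits, so the maximum running time is 2**(8 * (total_bytes - code_bytes)) steps.
def max_steps(total_bytes, code_bytes):
    data_bits = 8 * (total_bytes - code_bytes)
    return 2 ** data_bits

# Each extra byte of program code costs a factor of 2**8 = 256 in running time.
print(max_steps(16, 3) // max_steps(16, 4))  # 256
```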

Charlie Crystle

Wow, Tom. Great post, and congrats on the attention. I passed it on to my dev team...

Eighty

If you throw together the size of the program's own "stack" (for a Turing machine it holds the position of the writing head and the position in the code) and the size of the memory into N, sure, you get an upper bound for the running time of 2^(N-code).

I think it will be hard to attain that upper bound though. The stack would have to run through all its permutations an equal number of times, and that puts more restrictions on the problem (for example, if the stack is stored in binary and each counter in separate bytes, the size of the program and the size of the memory have to be powers of 2).

But for "any digital computer" you don't have the luxury of deciding how much space you want to use for the program, its stack, and the memory. If you're only given the size of the memory, and a maximum size for the program and the stack, the problem becomes much more difficult.

CS Prof

Counting is the answer.

With no outside influences our program must be deterministic. Given some state A, such an algorithm will always proceed to the same next state B in the deterministic order. If we only have the bits in memory to represent such states, and if there are N such bits available, then only 2^N states can be uniquely represented. Thus, a deterministic algorithm could only take at most 2^N steps. A counting algorithm is likely the simplest way to visit each of these 2^N unique states.

Aside: If you assume that you also have extra bits elsewhere in the computer to help represent states then that merely adds to the value of N. Practically, some number of bits must be set aside to form your algorithm code. Yet, a simple counter starting with all other "data" bits as 0's and adding 1 until it has all data bits as 1's will likely take the fewest number of program bits to represent.

Finally, the problem with using StupidSort or "counting up, then down" is that you have to assume that you have enough bits in memory to represent each state that you're visiting. For example, assume that you have a counting algorithm that goes through all possible combinations of the N available bits and thus through all available state representations. If you want to count up and then back down to double the running time, then you need an extra program bit to determine if you're currently counting up, or down. Removing this data bit decreases the number of states you can represent by half. So, no gain in longevity was achieved. For more information, see computer science complexity results, such as those relating space complexity classes to time complexity classes. (i.e. P <= NP <= PSPACE)
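The counting argument above is easy to check on a toy scale; a minimal sketch, assuming the whole machine state is just an N-bit counter (N kept tiny so the loop terminates quickly):

```python
# A deterministic machine with n bits of state must halt or loop once a state
# repeats, so it can take at most 2**n steps. A simple counter achieves the
# bound: it visits every n-bit state exactly once before repeating.
def run_counter(n_bits):
    state, steps, seen = 0, 0, set()
    while state not in seen:
        seen.add(state)
        state = (state + 1) % (2 ** n_bits)  # the deterministic next-state rule
        steps += 1
    return steps

print(run_counter(8))  # 256, i.e. 2**8 steps before the first repeated state
```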

Doctor

So, let's throw some Kolmogorov complexity into the mix: the size of the program is also significant. Basically, you are given n bits of memory. You can program your machine to interpret the memory as code or data. The solution is to make the machine go through every possible state of those data bits, for 2^(n - code) steps. Making the code bigger makes the data smaller, for no net win.

UJ

I wonder how all these great programmers have enough time to read and debate the worth of good and great programmers when they have to put in 72-hour weeks to meet deadlines.
Meeting deadlines with quality, maintainable code is what justifies success.
I would say a good programmer is as good as a great programmer as long as he finishes his work within the given deadline.
If a programmer can deliver more reusable, better-performing code, he may qualify as a better programmer, but nothing more.

Eighty

As stated before, a counting program is not the answer. If you make the program larger you can run through all permutations of the memory many times (consider a Turing machine with one memory cell and a program of 100 instructions).

There can't be an algorithm; an algorithm has a certain size, and for a long-running program you want a long program (compare with the Busy Beaver problem).

auriga

Well the purpose of the software industry is to make money, not invent solutions to academic problems. So a great programmer would ask: Why do you want to solve that problem? It might turn out that you do not actually need a solution to that problem, but to another one.

biff

void main(void)
{
    int x;
    looooooooooooooooooonnng int y = infinity;

    for (x = 1; x <= y - 1; x++);
}

Vince P.

Solution in C#:

using System;

namespace ConsoleApplication1
{
    class Class1
    {
        [STAThread]
        static void Main(string[] args)
        {
            while (true)
            {
                Console.WriteLine("Hi there!");
            }
        }
    }
}

There you go.

I can hear you now. "But.. but that's infinite!" you say.

Really? Prove it.

Stupid question really.

Dennis

To everyone who thinks they can make the longest-running program simply by counting up to a number consisting of all the bits of RAM set to 1...I can double your running time by making the program slightly larger and counting back down to zero.

You're not cycling through all possible running states if there's a separate program in there doing the cycling.

Rintoul

How can someone with a blog which cannot be printed easily comment about "great programmers"?

w durden

I agree there is some uncertainty in the problem statement. What is finite, for instance, when the problem also says the computer will run for as long as it takes? For instance: zero all of the RAM, randomly flip one bit, and check for all 1's. The probability of eventually hitting all ones in a finite-size memory is 1 if we have an infinity of time to wait. (If you reduce it to some meaningfully small amount of RAM, we pragmatically know this is true. While folks might argue, where do you doubt it is pragmatically true? Will it happen with two bits in finite time? Four bits? How many bits of RAM do I have to have before practical experience says it really might never hit all 1's?) It's a limit problem. If we have infinity to wait, the probability = 1. If this doesn't violate the infinity aspect of the problem, then randomly flipping bits and checking has a probability of running longer than any count-based approach, but it is not GUARANTEED to take longer than a count-based solution, and it is only GUARANTEED to finish in the infinite limit, though pragmatically with a finite memory store it must finish. If this violates the problem, then you have to fall back on iterating through all of the machine states...

Casey Marshall

Doesn't "longest running" and "infinite" kind of eventually both end up at the same place? Maybe I don't understand the wording of your question or what you're looking for.

To burn a lot of cycles I would probably do a Towers of Hanoi with 64 rings. I have heard that takes a good long while. Of course, you can beat me by adding another ring, but I am sure the hardware will fail first :)

So here is a counter-question, which maybe will enhance my understanding of the problem. Can you describe a non-infinite program that cannot be made longer-running by increasing some factor N – whether that N is the number of bits in a counter, bits in a keyspace to brute-force, or rings on a post, etc. – and is still longer-running than all of these, for any arbitrary size N?

If you can, my intuition tells me that solution would be infinite. If that is not the case, I am prepared to be enlightened.

w durden

There is a mixture of Roland's and Nigel's answers that would work better. Use a bit index of the RAM to keep track of your position therein. Count (flip the bits of the RAM below the index) through every bit state below the index the number of times the bits above the bit index can be counted (i.e., create every machine state below the index, repeated by the number of machine states above the bit index), then increment the bit index one bit. This runs longer than Roland's because it creates every possible state lower than the RAM max multiple times. Unfortunately my brain was/is warped with imperative-language thought patterns, so give me a bit before I post something in pseudocode... All this of course divides out the program memory space from theoretical RAM space. Our digital computer is part finite-machine RAM and some other part that can set those states. Reality intrudes and makes us carve out part of one for the other...

Back to the problem. So, for instance, say we have eight bits of RAM. When our index is at bit three, we can count to 4 below the index and 32 above (2 to the 2nd and 2 to the 5th). We end up setting up all the states below multiple times. Here reality is our friend if we assume the binary value of our digital machine is much larger than the maximum counter we might have in some "other register". We are using the RAM as both machine state and counter...

Jimbob

I've met those programmers that work in 72 hour stints. They produce garbage. After the first 12 hours, they can't think clearly enough to produce coherent code. I've been the person to clean up their mess more than once, and let me tell you, I could've done a better job working 8 hour days for 3 days than they did in the 72 hours straight.

A. E.

This is not the same as the Busy Beaver problem. The counting algorithm is the correct answer.

The program will run infinitely if the memory state repeats itself. Hence we want to go through the maximum amount of states without repeating, which is easily achieved by a minimal counting program. Just wipe all the memory and start counting. It's probably not good to count using the registers, unless the code takes less than 32 bits (or whatever the register size is) per register.

I don't know if knowing this proves that you are a great programmer though. A mathematician would probably be able to figure out the same thing without knowing a single programming language.

Dean

I don't believe you. There is no such thing as a "great" programmer that's worth 50 "good" programmers. Even Michael Jordan was not worth 50 good NBA players.

While I agree that one person doing a project alone saves a lot of time by not having to communicate with anyone else, I don't think that's meaningful, except in the smallest of projects.

Also, the "great" programmers in your example appear to be only designing and coding. This is unrealistic. There is so much more to a programming project than that. You have to talk to users about requirements, you have to test, you have to document. What about usability? If you're building a UI, what about making it good-looking? You have to "interface" with a lot of other people for all these things. At least I do. In other words, your "great" programmer has not finished your hypothetical project for you.

And your example of a "great" coder sounds like a real cowboy to me. I wouldn't trust the work of anyone who worked in a "seventy-two hour spurt", or even a "forty-eight hour" one. But even if someone could produce excellent code that way, so what? Why is that better than someone producing excellent code in eight hours a day? Working seventy-two hours straight does not make you a faster coder. All you've done is compress eight days into three – the coding takes the same number of hours. I don't agree that there is an "interface" problem for a single coder stopping work for the night. When you're deeply involved with a project, it stays in your head even if you take the time to go home, eat, shower and sleep.

I think it's a disservice to tell employers they should try to get "great" programmers who do the work of 50 "good" programmers. Because, like Santa Claus and the Easter Bunny, they don't exist.

Gaurav

Well, the problem with having one great developer as against 5 good ones is that if that one great developer leaves, you are screwed. I think one great software architect who can design the system well is of greater value. Good design is more often than not easy to code to.

Notany

The answer to the programming question is this: the longest-running (non-infinite) program changes only one bit of the program's internal state per operation. If you use a proper programming language (it must have big integers), the answer may be something like this:

let x = 0
while x < MAX_SIZE_OF_MEMORY
do
    x := x + 1
end while

Any good programmer can answer this question. But then, only a really good programmer can spot a great programmer. Trick questions don't usually work because a superintelligent programmer is not necessarily a super programmer (there is a correlation, of course).

Richard Freytag

Tagging onto Scott Aaronson's excellent article on Big Numbers (see: http://www.scottaaronson.com/writings/bignumbers.html) that Ben mentioned, I took a stab at proposing a bigger number than Scott Aaronson's (see: http://www.freytag.us/twiki/bin/view/Freytag/BigNumber). That page has a link to the longest-running Busy Beaver algorithm I could build using the largest number I could imagine.

For numbers this large and Busy Beavers this long, we have an exercise in program proving, as we've got to prove these programs will halt. The alternative of running to completion is simply not feasible given the time available before heat death or collapse (or choose your flavor of cosmological end).

Sadly, I'm pretty sure this says nothing about me as a programmer.

Brad

Sounds simple, but the question is how you tell the good programmer from the bad one. The bad ones often look good.

Dennis

Predictably, there are comments reframing the debate to: "Why a competent developer is worth 50 incompetent ones"....which is also true, and much more comfortable; because many of us are competent (though many more are incompetent), and few of us merely-competent programmers are willing to admit that there is a smaller minority of truly great ones.

I'm a competent developer. I'm not a great one. I will never be the equal of Alan Kay, or even Paul Graham. However, recognizing my inadequacy makes me keen to read the writings of people like that, and try to figure out why they're better than me. Doing that has made me a much better programmer than I was even two years ago.

Good, professional development practices are important. But being a self-satisfied "professional developer" can blind you to the fact that it's possible to get much, much better at actually writing code.

Dave Eaton

In the heyday, I remember a great programmer who got himself into the position of single-handedly writing a standards-compliant web browser with a built-in HTML WYSIWYG editor. He indeed worked 100- to 120-hour weeks. He saw little of his wife and kid. He is proof of what this article states. Other super programmers never seem to do more than 12 hours in a day. I doubt longer hours can be sustained long-term. See http://c2.com/cgi/wiki?EightHourBurn

vanjulio

I'll get in trouble if I don't repair my last post.

The halting problem is that you can't write a program that decides, from another program's description alone, whether that program will terminate. It is undecidable in the general case.

So you'd be waiting forever for someone to come up with the solution, I guess, is my answer! The answer will never come, because you'd have to examine every possible program.
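vanjulio's undecidability point is the classic diagonal argument: if a halting decider existed, you could feed it a program built to do the opposite of whatever it predicts. A minimal sketch in Python, where `halts` is the assumed (and impossible) decider; both names are hypothetical:

```python
def halts(f):
    """Assumed decider: returns True iff calling f() eventually terminates.
    No such function can exist, which is the point of the sketch."""
    raise NotImplementedError("undecidable in the general case")

def paradox():
    """Does the opposite of whatever the decider predicts about it."""
    if halts(paradox):   # decider says "halts"? then loop forever
        while True:
            pass
    # decider says "loops forever"? then halt immediately
```

Either answer `halts(paradox)` could give is wrong, so no such decider exists.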

vanjulio

The longest running program would be the one that waits to see when the longest running program will halt. This is the halting problem. It is Turing undecidable.

But if you just want one that runs forever:

[true] whileTrue: [Smalltalk beep].

Oleg

Well, computing all possible permutations of the characters of a long string does take a while. If you pick a long string, let's say 10,000 characters, it would take more time than the age of the universe.
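Oleg's estimate checks out: 10,000! is the number of permutations, and the log-gamma function puts it at about 35,660 decimal digits, while the age of the universe is only on the order of 10^17 seconds. A quick back-of-the-envelope check in Python:

```python
import math

# log10(10000!) via the log-gamma function: lgamma(n + 1) = ln(n!)
log10_perms = math.lgamma(10000 + 1) / math.log(10)
print(int(log10_perms) + 1)  # 35660 digits in 10000!
```

Even enumerating 10^18 permutations per second, the job needs roughly 10^35624 universe-lifetimes.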

Robby Slaughter

The problem with this article is that it uses the wrong superlatives. It ought to be titled "Why an Actual Software Developer is Worth Fifty Unskilled Hackers". Of course, then no one would read it. Most "good programmers" are actually not very good at all. For some reason, people learn a bit about HTML, read a dummies book on VB, and think they are experts. The industry is so desperate and cheap that we hire these people.

John

"A superstar prefers to work in at least twenty-four if not forty-eight or even seventy-two hour spurts."

Everything is true except this statement, which IS complete fantasy.


Simon Jester

I like Erik Peterson's answer best.

I think Tech Sync makes a reasonable point, but the article works even better if you say "system developer" rather than "programmer". (A system developer being someone who has all the necessary skills for the project in question, rather than "just" programming skills.)

Ben

If I correctly recognize your problem, this is a re-phrasing of the busy beaver problem (do google, or read the excellent article here: http://www.scottaaronson.com/writings/bignumbers.html).

The busy beaver number B(n) is the maximum number of steps a Turing machine with n states can take and still halt.

Basically this is uncomputable, and it grows faster than any computable function; faster than (x! ^ x!), for example.

(For the person who noted that Stupid Sort was longer than just counting, this should be a tip off that higher algorithmic complexity is possible. And in terms of the problem, the no I/O restriction is a bit unclear -- writing to a file isn't allowed, but writing to memory is?)
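For scale, the known small values are S(1)=1, S(2)=6, S(3)=21, and S(4)=107 steps; after that the function outruns anything expressible. A minimal sketch in Python that simulates the 2-state busy beaver champion (the encoding and function names here are mine):

```python
def run(tm, max_steps=10**6):
    """Simulate a Turing machine given as {(state, read): (write, move, next)}.
    'H' is the halt state; the tape is a sparse dict of nonzero cells."""
    tape, pos, state, steps = {}, 0, 'A', 0
    while state != 'H' and steps < max_steps:
        write, move, state = tm[(state, tape.get(pos, 0))]
        tape[pos] = write
        pos += 1 if move == 'R' else -1
        steps += 1
    return steps, sum(tape.values())

# The 2-state champion: halts after S(2) = 6 steps, leaving 4 ones on the tape.
bb2 = {
    ('A', 0): (1, 'R', 'B'), ('A', 1): (1, 'L', 'B'),
    ('B', 0): (1, 'L', 'A'), ('B', 1): (1, 'R', 'H'),
}
print(run(bb2))  # (6, 4)
```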

Nigel Sedgwick

Apologies for my late posting, which replicates your own stuff. I am new to your blog and came in through a link straight to the article page.

Best regards

Nigel Sedgwick

I'm with Roland (having looked at "cleverer" things involving pseudo-random number generators). The state space is determined by the number of bits in the machine memory; I don't think you can make it larger. No I/O is allowed, so a true noise generator cannot be used. One or more states must be the termination condition, as otherwise the program is not guaranteed to terminate. Guaranteed execution is longest with just one termination state; make sure that state is reached last (so all my thoughts on random numbers were pointless).

The only addition I have is to compute something long-running inside the counting loop. For that to help, though, the extra code would itself have to run for longer than what is lost by taking the same memory away from the state space of Roland's counting algorithm. That would be a machine dependency that might or might not hold. [I hope I've got that right.]

Best regards

Roland Turner

Assuming that

- the question is actually for a "finite but bounded" digital computer (i.e. not a "finite but unbounded" Turing machine)

and

- by "non-infinite program" is meant a program which will halt eventually (in principle at least - even if the hardware does not decay) rather than, say, a program recorded in a finite amount of space

the most general non-repeating program would simply be a binary counter which uses the whole of RAM (plus registers, etc.) as its stored value, starts with all bits at zero and halts when all of the bits have reached one. Even with present-day computers, the energy cost of flipping that many bits would exceed the entire remaining energy in our sun, of course.
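Roland's counter is easy to model in miniature: an n-bit register started at all-zeros and halted at all-ones performs 2^n - 1 increments, so every extra bit of memory doubles the running time. A toy sketch in Python (`count_up` is my name, not Roland's):

```python
def count_up(n_bits):
    """Ripple-carry increment an n-bit register from all-zeros,
    halting when every bit is one. Returns the number of increments."""
    bits = [0] * n_bits          # bits[0] is the least significant bit
    steps = 0
    while not all(bits):
        i = 0
        while bits[i] == 1:      # carry propagates over trailing ones
            bits[i] = 0
            i += 1
        bits[i] = 1
        steps += 1
    return steps

print(count_up(3))   # 7, i.e. 2**3 - 1
```

Scaled to a machine with even one gigabyte of RAM, that is 2^(8 * 2^30) - 1 increments, which is where the energy argument comes in.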

Erik Peterson

while (not_infinite = true) do {something();}

tech-sync

How to identify a Great Programmer?

Many times we require other skills, like design, architecture, and testing, for which one need not be a great programmer. Any comments?

www.Tech-Sync.com

Fred Wamsley

Of course I had corrections to the above by last night.

First, that should read "Turing machine with finite tape".

Second, except for registers it doesn't mention running through various CPU states. Given the number of gates in a modern CPU the number of possible states is something we don't have words for.

Bringing up the real point: the longest-running program possible is an exhaustive self-test.

The practical answer is that the longest-running program is one that keeps going until the hardware dies of electromigration. Assuming perfect hardware, the answer is that all programs are limited by proton decay turning the hardware into a bunch of neutrinos, something which would happen long before reaching every possible combination of states.

Fred Wamsley

Well, a Turing machine has a finite if unimaginable number of states, so the way to keep it running is to go through as many states as possible before reaching the halted state.

The first thing that comes to mind is the StupidSort algorithm, applied to the whole of memory. This is a sorting malgorithm that generates all possible permutations and tests each for sortedness.

The next thing I'd check out would be treating all the registers and all the RAM as a single multi-precision int, zero it, and count until it overflows.

Ways to slow it down further might include spin loops, but every byte used for code takes one byte away from the RAM whose size governs the slowdown.

A better refinement than a spin loop might be to toggle bits in, say, 16 widely separate areas of RAM on every iteration so as to destroy any chance the machine might have of getting a speedup from cache. Or better (since it's smaller code) count by some large number relatively prime to the maximum bignum.

At a guess StupidSort is going to win: it scales like N! * N / 2, where the counting algorithm is more like 2**N. But that depends on the size of unit that StupidSort is working on: it will converge too quickly if sorting single bits or if sorting chunks each of which is half as big as RAM.
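A deterministic StupidSort can be sketched as exhaustive permutation search: enumerate permutations in a fixed order and stop at the first sorted one; a reverse-sorted input of length n then forces all n! candidates to be examined. A small Python illustration (the function name is mine):

```python
from itertools import permutations

def stupid_sort(xs):
    """Try every permutation, in itertools order, until one is sorted.
    Returns the sorted list and the number of permutations examined."""
    for tries, perm in enumerate(permutations(xs), start=1):
        if all(a <= b for a, b in zip(perm, perm[1:])):
            return list(perm), tries

print(stupid_sort([3, 2, 1]))  # ([1, 2, 3], 6) -- all 3! orderings tried
```

On a reverse-sorted input the sorted permutation is emitted last, which is the n! worst case Fred describes.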
