Newsletter
Issue 39, 03 Apr 2003


  In This Issue:
 
    On Languages by Michael Phipps
    How Do You Do It? by Michael Phipps
    Common Sense by Michael Phipps
 
On Languages by Michael Phipps 

In my last year of college, I took a number of "advanced" computer science classes, including a class on compiler design, a couple on operating systems and three on computer architecture. From these classes, I drew some conclusions.

There is a small number of fundamental operations that computers perform. When you get right down to it, a CPU today does very little - it does basic math, loads and stores values in memory, and can jump to a different instruction. There is very little complexity in the concepts behind a CPU. The complexity comes in the details.

Many languages that we use today have very primitive roots. I saw a chart that showed the progression of every language from Fortran, COBOL and Lisp all the way up through C#. There was and is a lot of evolution going on, but not a whole lot of revolution - not a whole lot of radically different thought.

In most modern-day languages, there are some intrinsic types (character, integer) and composite types (not known inherently to the compiler). Because the compiler knows about intrinsic types, it treats them a little differently from other types. They have less overhead, they compile down into machine-specific instructions, and they require no special effort.

All code consists of the same sorts of things that a processor does inherently - basic math, loads/stores, and jumps (conditional and unconditional).
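
One quick way to see this for yourself is to look at what a small function boils down to. The sketch below uses Python's standard dis module purely as an illustration - the exact opcode names it prints vary between interpreter versions - but everything in the listing is a load, a store, a piece of arithmetic or a jump:

    import dis

    def clamp(x):
        if x < 0:        # a compare followed by a conditional jump
            x = 0        # a store
        return x * 2     # loads, a multiply, and a return

    dis.dis(clamp)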

Finally, I saw a figure showing that C has more than 40 keywords and C++ has more than 100. That seemed too complex to me.

So, as these ideas started to germinate, I came to the conclusion that we needed to build a language that was different. Austere and simple, yet as powerful as anything on the market. The introduction of Java and its struggles with library support showed me the true power of programming languages: pre-built modules. In the "olden" days of computing, you could build a language with no more pre-built support than reading and writing a line of text, and it was acceptable. Since the 80s, it has become clear that no developer wants to start from scratch, building every component that he will ever need - today's schedules and applications just do not allow that. So the language needs a great support library to begin with.

So what should this language be like? The first criterion is that it cannot be tied only to text files. Part of the problem that we have today with programming is that we are stuck in the 1950s with compile/link technology. There are whole books written on how to restructure your code so that the compiler can deal with it correctly and efficiently. Certainly compilation needs to occur. But it should be possible to build the link stage into the loader so that applications load and link dynamically.
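
Run-time linking of this sort already exists in a limited form today. Here is a minimal sketch using Python's standard ctypes module to bind a C library function when the program runs rather than in a separate link step; the particular library and function are just illustrations:

    import ctypes
    import ctypes.util

    # Find and load the C runtime library at run time - no separate link step.
    libc = ctypes.CDLL(ctypes.util.find_library("c"))

    # The symbol is resolved when we reach for it, not when the program was built.
    print(libc.abs(-42))   # prints 42

Something like this, pushed down into the loader itself, is what the paragraph above is asking for.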

The next criterion is that it has to be easily comprehensible to beginners. We (computer scientists) used to understand this. A young child could pick up a book on BASIC or LOGO and write small applications - sometimes even whole, complete apps. This was something of a necessity back then, since the prepackaged software market wasn't anything like it is today. Still, the ability to make a computer do what you want is a feeling like no other, but the difficulty of most languages has placed a significant barrier to entry in front of most people. If they can't get some sort of result in an hour or so, they will quit trying. That is, in my opinion, the reason for the success of languages that we consider poor in quality today, like Visual Basic: a non-skilled, non-degreed person can sit down in front of VB and make an application that does something in a short period of time.

Finally, the language has to be powerful. The easy languages of yesterday were too slow and too limited to make great applications out of. It was common to hear, back in those days, that if you wanted decent performance or to do certain things, you would have to write in assembly.

Starting over with a clean slate allows the decisions of the past to be reviewed in a different light. One example of this is the use of key words or curly braces to demarcate blocks of code. Of course, the first thing that you are taught in programming class is to indent those blocks of code for human readability. But... wait - you need to mark the code one way (braces) for the compiler and another way (whitespace) for the human? Why? A similar example is the use of statement terminators (usually a semicolon). But the next thing that you are told in class is that you should have one statement per line. Why is that necessary? Human readability. Hmmmm... One more example - one that is sure to be a little more contentious - in order to build your own types, you have to explain to the compiler how that type should look and how it should work, but you have to split that explanation across two files - a look file (a header file) and a work file (a code file). All because we are stuck! ...in a text-based paradigm.
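
To make the braces-and-semicolons point concrete, here is the same tiny block written twice - once the way a C-family compiler wants it, and once in an indentation-delimited language. Python stands in here purely as an illustration; the point is that the whitespace the reader needs is the only block marker there is, and the end of a line is the end of a statement:

    # C-family version - the block and statements are marked twice over:
    #     if (x > 0) {          /* braces for the compiler */
    #         y = x * 2;        /* indentation for the human */
    #         printf("%d\n", y);
    #     }

    # Indentation-delimited version - no braces, no semicolons:
    x = 5
    if x > 0:
        y = x * 2
        print(y)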

How does all of this tie into OpenBeOS? I came to BeOS from the Amiga. On the Amiga, we had something called ARexx. It was based on IBM's REXX language and was used extensively for scripting between applications - a language wrapped around "hey"-type functionality. I was able, for example, to write an ARexx script that made an application I had purchased (Art Department Pro) open pictures, rotate them, scale them and save them in another format. All in an automated fashion and in a few lines of code. The pseudocode was something like:

for each file in mypics: 
        adPro.load(file) 
        adPro.rotate(-90) 
        adPro.scale(.1,.1) 
        adPro.save(mypics:file.jpg,"JPG")

How easy is that? Of course, load, rotate, scale and save would all be "verbs" (in the BeOS scripting sense) exported by adPro. Still, something like this would be a phenomenal tool. Imagine it combined with OBOS' media capabilities. One example that comes to mind is building your own digital video recorder with something like this. If the language were easy enough and the components were all available, it would not be all that hard. How about an application that automatically creates still shots from full motion video, captions them with the closed-caption text, and posts them on a web site? Or one that forwards important e-mails from your home address to your cell phone? The number of things that you might want to do with your system when scripting is this easy might surprise you, once you look at it from a fresh point of view.
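
For comparison, here is roughly what that same batch job looks like with today's tools - a short Python sketch using the Pillow imaging library. This is an illustration only: the article's example drives Art Department Pro through its ARexx port, not a Python library, and the folder and file names below are made up.

    from pathlib import Path
    from PIL import Image

    for file in Path("mypics").glob("*.png"):
        img = Image.open(file)
        img = img.rotate(-90, expand=True)                      # rotate 90 degrees clockwise
        img = img.resize((img.width // 10, img.height // 10))   # scale to 10%
        img.convert("RGB").save(file.with_suffix(".jpg"), "JPEG")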

 
How Do You Do It? by Michael Phipps 

Many people ask me how I do it - how I find time to write code, write newsletter articles, respond to all of my e-mail and oversee all of the team leads and so on. Generally, I respond to these questions on an individual basis. I thought that it was time, though, to answer in a more general and comprehensive way.

Time management is never easy. That admission is probably the best way to start. It requires self-discipline and reduces some of the freedom that you have in your life. At least, that is how it seems at first. The first step for me was to decide that the project was very important in my life. More important, say, than playing video games. More important than watching mindless television. More important than the project that I had been working on. In fact, I placed OpenBeOS higher on my priority list than anything except my family. For me, that meant cutting out a lot of "other stuff" that I used to do. It meant putting aside the things that aren't important and replacing them with the things that are. When everything is going well, this isn't too hard. It is only when the motivations are fewer and farther between that this becomes a real struggle.

Now that the priorities are figured out, the next step is to arrange your life so that you have more time. Part of this "found time" can come out of the items that you agree to cut out of your life (from above), but you will find yourself resenting your free-time project if you cut out all of your fun activities in order to participate. So try, instead, to cut out non-fun activities. One example is that I have started to "work to rule" - 40 hours a week and not a minute more without overtime pay. Another is that I sometimes skip my lunch period - I eat at my desk, quickly, and leave work early. I also try to consolidate my driving and my chores so that I waste as little time as possible, and to gather as much of my time as possible into my most productive hours - early evening.

So now we have some time made. How can we spend that time? As much time as possible should be spent in the environment that is most conducive to your working style. For me, a quiet room without interruptions is best. I have a home office and my own PC to work on. That allows me to go deep "into the zone" - totally focused on the problem at hand. Others find a noisier environment (music, TV, whatever) helpful.

Finally, committing to spending time is important. Many people incorrectly believe that they need to spend dozens of hours a week on an open source project in order to be helpful. I assure you that this is not the case. I would be thrilled to have people who say "I will commit to 4 hours a week, in two 2-hour blocks, probably no more" and stick to it. When I know that up front, I have a reasonable expectation of them and can suggest things to do that are small enough to be completed in that sort of time frame.

So what happens when the "get up and go" is gone? When you just don't feel like working on it? This is what Robert Pirsig, in "Zen and the Art of Motorcycle Maintenance", calls a gumption trap - a temporary lack of desire to work on something that you normally very much want to work on. Pirsig recommends (and I agree) that you take a break. Leave it alone and go do something else. Every so often I have to do that. The times when I have tried to "force it"--to push ahead even when I am not in the mood--well, the resultant quality is just not there. It is very important that it stay fun. If that means taking a break, well, that is OK.

Probably the most important aspect of all of this, though, is to remember what is really important. Don't ignore your family and friends. Don't ignore your classwork, your paying job or your hygiene too much. All of these are necessary parts of your life, too. Putting too much of your attention on any one part of your life will cause you all sorts of trouble and stress later on. A balanced life is like balanced nutrition.

 
Common Sense by Michael Phipps 

Today someone asked me if I thought that tablet PCs were going to "make it". I responded that I believed that they would fail, just like last time. He looked blankly at me and asked if this had been tried before. This led me to think a little bit about the industry that we are in and the nonsense that is perpetrated on the public in the name of progress.

It seems to me that the first step is that an idea or theme is conceived - or, in some cases, recycled. These ideas often seem to be reactionary - a sharp change from the current direction. We have seen this many times in just one area - distributed vs. centralized. We went from mainframes to desktops to client/server (centralized) to heavy clients (distributed) to web-based. So the next trend should be distributed again. We shall see.

The concept catches the eye of a journalist in one of the weekly industry rags and it gets an article. Suddenly, like a thousand wildflowers all at once, there are products announced, mergers initiated, and a whole flurry of activity commences. Books, written under great pressure and without a whole lot of regard for correctness or completeness, are churned out to rake in sales from managers and trend followers who need to learn the latest buzzword.

Prototypes and initial products are released and panned by reviewers and the public for being exactly what this sort of frenzied atmosphere must produce - a half-baked solution. Most concepts and ideas end at this point. The prime example is the Internet appliance. Not to beat a dead horse any more, but what was fundamentally a decent idea died an ignominious death due to the garbage that manufacturers tried to sell to an excited crowd.

Some "trends" survive, though. XML, Java and windowed operating systems are among them. Many fail - tablet PC's (the first time, as well as this time, unless I miss my mark), IAs and object-oriented databases are some good examples.

What is the differentiating factor? What causes some ideas to flop horribly and others to persist? Part of the difference is the quality of the implementation. XML is a good example here - XML 1.0 is a pretty decent standard. Sure, many of the tools were poor, but XML itself is "good enough". Part of the difference is the level of hype and expectations. Object-oriented databases were going to radically change the way that everyone deals with databases - except that there is a huge group of people out there who don't want that to change; they have a lot of time and training invested in relational databases. And part of the time, the idea is just a poor one. This is the category in which I place Tablet PCs.

I started writing a newsletter article on my Palm Pilot. I got through two sentences before I realized just how much I hate writing by hand. Typing is *so* much faster and more efficient. I can type a word in the time that it takes to write a letter or two, and mistakes only compound the disparity. I switched to the built-in on-screen keyboard tool and was at least 3 times faster tapping letters than writing them. Finally, I pulled out the portable keyboard that I bought with the Palm, and I was half again as fast with that. As I was describing this situation to my friend who was inquiring about the eventual fate of the Tablet PC, it became obvious to me that the Tablet PC, while really cool technology, is not for most people.

Unless you are a really slow typist, pen-based text entry of any sort will be *far* slower for you than even hunt-and-peck typing. So it seems that the niche for the Tablet PC would be people who don't need to enter a whole lot of text. That seems pretty limited - UPS drivers and warehouse employees, maybe. Far more limited than the Microsoft ads would lead me to believe.

A common sense approach to these fads needs to be applied. Look at them and ask yourself:

  1. Are the people in charge of decisions going to benefit?
  2. Is there a pressing need that existing technologies/systems don't fill well?
  3. Can this be done fast enough and cheaply enough that people will buy into it?

If the answers to those questions are not all yes, then don't bet on the technology going anywhere.