Thursday, February 9, 2017

Case in programming identifiers

By no means a comprehensive list, and feel free to give me alternate names or etymologies...
  • LoremIpsum
    • CamelCase
      • This is ambiguous and I really don't recommend using it. camelCase traditionally starts with a lowercase letter (do camels have a hump on their heads?).
    • UpperCamelCase
      • It is exactly the same as camelCase except with the first letter capitalized, and it is unambiguous, but this is rarely used.
    • PascalCase
      • This is the preferred case in Pascal, and I would dearly love for people to consistently call this case PascalCase and just get on with it.
  • Lorem-Ipsum
    • Train-Case
      • This is not something I have seen in the wild, but I have heard of it. A plausible etymology is a series of boxcars linked by couplings which aren't quite at the middle (as in spinal case) or at the ground level (as in snake case) but closer to the ground, over where the wheels would be.
      • Largely superfluous- if you want to use capital/lowercase letters together, then why would your standard use this and not camelCase or PascalCase?
  • lorem-ipsum
    • spinal-case
      • As in, the words seem to be connected centrally through a "spine".
    • kebab-case
      • As in, the words seem to be pierced through the middle by a skewer, as a kebab. Yum!
    • lisp-case
      • This is the preferred case in Lisp, and some refer to it this way, although I would call it somewhat less common these days.
  • loremIpsum
    • camelCase
      • The first word is not capitalized, but the second word is; the increase in case looks like a "hump" and reappears on subsequent words.
      • Common in Java, JavaScript, and myriad other languages.
      • In some shops, it's common to use PascalCase but include a lowercased Hungarian prefix; this results in camelCase identifiers (e.g. szCamelCase), but Hungarian notation is also diminishing in popularity.
  • lorem_ipsum
    • snake_case
      • As in, the words are connected along the ground, like a slithering snake... why'd it have to be snakes?
      • Common in C/C++ identifiers, although I don't like it. I prefer camelCase even in C.
  • LOREM_IPSUM
    • SCREAMING_SNAKE_CASE
      • Taken as a modifier of snake case for its use of underscores, the SCREAMING aspect refers to making all letters capitalized, but I haven't heard SCREAMING in any other case- possibly because the idea of a screaming snake is particularly goofy.
      • Universal for C/C++ named constants.
  • LOREM-IPSUM
    • COBOL-CASE
      • COBOL is case-insensitive (and traditionally written in all capitals), and it doesn't support many delimiting characters in identifiers, so the hyphen is the best choice.
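
Since all of these cases carve an identifier into the same underlying words, converting between them is a nice little exercise. Here's a minimal JavaScript sketch (the function names are my own invention); it won't handle every edge case, like embedded acronyms, but it covers the cases listed above.

  // Split an identifier into lowercase words, whether it arrives as
  // camelCase, PascalCase, snake_case, kebab-case, or SCREAMING_SNAKE_CASE.
  function splitWords(identifier) {
    return identifier
      .replace(/([a-z0-9])([A-Z])/g, '$1 $2')   // break on case "humps"
      .split(/[\s_-]+/)                         // break on spaces, underscores, hyphens
      .filter(Boolean)
      .map(function (w) { return w.toLowerCase(); });
  }

  function toCamelCase(id) {
    return splitWords(id)
      .map(function (w, i) { return i === 0 ? w : w[0].toUpperCase() + w.slice(1); })
      .join('');
  }

  function toPascalCase(id) {
    return splitWords(id)
      .map(function (w) { return w[0].toUpperCase() + w.slice(1); })
      .join('');
  }

  function toSnakeCase(id)          { return splitWords(id).join('_'); }
  function toKebabCase(id)          { return splitWords(id).join('-'); }
  function toScreamingSnakeCase(id) { return toSnakeCase(id).toUpperCase(); }

  // toPascalCase('lorem_ipsum')        -> 'LoremIpsum'
  // toKebabCase('LoremIpsum')          -> 'lorem-ipsum'
  // toScreamingSnakeCase('loremIpsum') -> 'LOREM_IPSUM'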

Monday, February 6, 2017

Dell XPS 13 9360

I want to state upfront that I am suspending my overall judgment of this machine until I can identify how difficult it plausibly is to fix my issues and how widespread these issues likely are. The last time I was buying a laptop, it was 2011 and I spent about $370 for a cheapie Lenovo, having no expectations other than buying the cheapest thing out there which had an i3 and 500GB HDD. For something more than three times the price, I expect a good deal more. And right now I'm getting it, but I'm kinda not at the same time.

This article will cover my impressions of all aspects of the machine, and will be updated continuously.


The good stuff

For my money, on paper and when checking it out in the store, this is the best laptop out there.

Prices start at $1000, rising to $1300 for something with guts. I ordered mine with all the top specs except touchscreen, which unfortunately meant I was limited to 8GB RAM- the only hardware drawback. The Kaby Lake i7 runs fantastic. Everything is blazing fast with the 256GB PCIe SSD, and restarts are faster than I've ever seen on a laptop. It is only expensive compared to laptops which are made with lousy keyboards, lower specs, or such bulk and low battery life that they can't be considered effectively portable.

It has the same solid build quality and beautiful machined aluminum exterior as a Macbook Pro, with the unique addition of a carbon fiber touch surface (it feels exceptionally nice and keeps your hands cool and avoids smudges). Costs hundreds less than the Mac, but I'll be damned if I can spot what they have saved money on! It's not a surprise that Apple sucks on performance per dollar, but their industrial design is top notch. Have I been asleep for a long time and Dell has been making good-feeling PC laptops for years? Could be that I've not been paying attention, but this XPS surprised me in the store.

Keyboard has a delightful feel. I'm partial to clickiness and some tactile feedback but also a feeling of robustness. The feedback is roughly similar to the Surface Book and high-end Lenovos that I've used, but those feel really chintzy, like I'm going to destroy them by punching the keys all the time. The HP Spectre 360's keyboard feels robust, but I don't concur with the contention by some reviewers (Ars Technica and others) that the feedback is high-quality; it felt spongy to me, so that was right out despite some other merits. The newest Macbook Pro's keyboard is severely disappointing, the stand-out worst of the bunch (although I sort of liked the previous generations) because the keys are so flat and the travel so small. It's almost as bad as a membrane keyboard. You really feel like you're just slamming your fingers against a solid surface. Not good.

I don't want to keep harping about Macbook Pro, because it was not on my buying list- but I include it because I can respect it as a fine, well-built machine. I don't like macOS and I don't like Apple's proprietize-everything ways. But even if you were to take those drawbacks away, I would prefer the feel of my Dell, which is something I never thought I'd say.

Battery life on the XPS is close to leading, if not leading. I can't verify an exact number, but I can run it all day long and it still has a decent charge. The thing also charges with USB-C. So if you have a phone that charges with USB-C anyway, you could pack exactly one cable in your bag and charge both devices. Joy of joys!

The minor irritations

This may come just as an indictment of Windows 10 in general, but it's full of fluff. I had to remove Minecraft and Candy Crush and a few other games- since when were these built into the OS? There are ads on the lock screen, ads in the start menu. All stupid, all easy to remove. I'm running Windows 10 Pro, by the way.

My first task was to use the Chrome Installer Finder (also known as Microsoft Edge) to get a browser on my PC. There are some slight improvements that Edge makes over the previous generation of Chrome Installer Finder (aka Microsoft Internet Explorer) but it's still pretty lame to use- thankfully you're only going to use it once before you get Chrome, and then the problem is academic.

Some annoyances I know are Dell-specific. I had McAfee pre-installed, and the second thing I did was remove it. Had to listen to it crying crocodile tears in big red letters that my system would burn to the ground without its protective aegis.


Ugh, wifi!

This is the meaty, juicy issue on this laptop. It's a shortlist of one. One gruesome issue out of the box is not a deal breaker weighed against the other virtues of this machine, and I am prepared to remain pleased if I can just overcome it.

https://www.reddit.com/r/Dell/comments/5hkiur/new_dell_xps_13_with_horrendous_wifi_connectivity/

The issues with the 9350 and 9360 are apparently well-documented by others, but the actual issues people mention are all over the place. It drops wifi connections, it doesn't connect to anything, it doesn't maintain the connection for more than 2 minutes, it has slow connectivity, or it doesn't even show networks. I didn't have any of those problems specifically, but a bad driver or bad hardware could be the root cause of all of them and then some.

My specific problem is two-fold:
  • I can connect to open wifi networks, but it doesn't resume the connection after waking from sleep, and it sometimes drops randomly even though the signal is unchanged in strength.
  • I cannot authenticate onto any encrypted wifi network. I know I get passwords wrong 50-70% of the time, but eventually I will enter it right. Never any joy here.
My progress:

  1. First attempt was letting the PC OEM (Dell) come to the rescue in the least intrusive way possible. Fuck phone calls, I want to see if you have some automated tool to help. And, actually, they do- a tool called Dell System Detect that analyzes your PC for irregularities in drivers, runs diagnostics, &c, and reports back on its findings. Seems to be fairly clever and quite comprehensive- if it actually provides meaningful feedback. Alas, it said everything was hunky dory, and that was not what I wanted to hear.
  2. Second attempt (I really hate calling tech support) was to see if manual driver updates would help. They handily make the driver installer available for the Killer 1535 wifi card, but after going through the hopeful process of installing and restarting the laptop (super fast restart, thanks PCIe SSD!) it made no apparent difference.
  3. Third attempt will be to finally call Dell and see if they can fix it, or failing that, send me a new wifi card. 


The call (2 hours of it)


Tech support guy was really nice, and it must be said, patient as hell. I have no beef with the technical aptitude of him or his supervisor- more of their wishful thinking that it was a simple configuration issue that I had overlooked.

Ultimately he took over my PC remotely and went to the driver download page and installed every single driver update offered, which I thought to be a bit overkill (what do the virtual buttons driver and serial IO driver have to do with wifi per se?) but hey, he has a process to complete. Ditto the annoyingly long list of diagnostic tools he downloaded onto my system- I'm just trusting that this is all protocol. Curiously, one of them self-deleted after a few days: it popped up a prompt saying that the session had expired and then called an uninstaller for itself, which is something I haven't seen before.

After he ran all that and all the settings and drivers looked fine and up to date, I demonstrated again that logging in to a secured network was still impossible. He talked to a sup, I guess, and decided to go into Device Manager and set the preferred frequency to 5GHz, which had previously been no preference. That didn't work, so he tried 2.4 GHz. At this point I'm just drumming my fingers.

We still had the issue of not being able to connect to my office router (and, I promised him, my home router as well). So he tested with me setting up a wifi hotspot on my Nexus phone with a simple password. I actually was able to connect to it- but with the caveat that I connected to it about 6 inches from the device. However, to my chagrin, he considered it enough to sink my argument that accessing an encrypted wifi network was systematically impossible.

The tech was polite, and so was his manager, but I insisted that the problem was not solved and I needed some more effective remedy. Citing the reports of others with wifi issues, I asked for a replacement card and assured them that I had the aptitude to open the machine and install the new card myself. Moreover, they offered that although they could not guarantee the warranty against issues caused by my own labor, the warranty would resume if the machine functioned with no apparent degradation after the replacement. Satisfied with this, I accepted their offer of a new card. Unfortunately, they said that they'd be sending me another Killer 1535. Seriously, does Dell just have a warehouse full of these things? I really, really wanted to control for that factor!

So, basically, I'm predicting the outcome breaks down like this.

  • 40% chance that replacing the card changes nothing because this is just not a good wifi card, whether due to hardware or driver issues, on this particular model.
  • 30% chance that it has another, different problem and is still just as useless to me.
  • 20% chance that something about the production process, possibly antenna placement, possibly the antenna itself, is wrong, and that by manual adjustment or other replacement parts I can fix it.
  • 10% chance that I actually have a dud wifi card and replacing it with no other alterations fully fixes the problem.


When it comes in the mail, I will install it with fastidious care. If the new card fixes the issue, I will attempt to the best of my abilities to provide a post-mortem analysis.

The final option, barring the failure of this, will be to buy something which others say is better supported with drivers on this particular device. A lot of people with XPS 9350s have found this Intel 7265 very suitable for Linux, and a handful have said it plays better with Windows than the OEM one too: https://www.amazon.com/Intel-Generation-Wireless-Computer-Notebook/dp/B00RCZ4I6S?tag=wpcentralb-20&ascsubtag=UUwpUdUnU41865

There may be other cards which would work equally well or better. If I get to this stage, I will research more and select the one I think is best.


To Dell

If you're seeing all the words I'm writing, understand that my criticism is meant to be constructive. If I didn't think you had a damn fine piece of hardware, I would not be wasting time with this and I would have returned it already.

I want you to improve this product on this issue so that I can actually recommend it to people. I will happily do so if I can isolate the problem and see you take measures to fix it.

Saturday, December 24, 2016

NYC parking app - An idea

I've had an idea bouncing around in my head for several months since I moved to NYC- a mobile app that notifies you to move your car when it is no longer legal to leave it parked where it is.



Street parking is underserved by technology


Background


No rants here about tight spots, inconsiderate drivers, or the difficulty of finding spots. Some of those can be influenced by technology, some cannot, but they are not the problem I am trying to address. I want to address the legal implications of where you park.

Parking restrictions take these forms, which I am ranking by my own sense of priority based on how widespread they are:
  • Alternate Side Parking, to my mind by far the biggest culprit, where an otherwise perfectly legal spot in any residential area becomes illegal for a short period of time for street cleaning twice a week.
  • School zones, which are fairly easy to identify and avoid, but sometimes they are the only place to park late at night and require you to move your car by 7am the next morning.
  • Loading/unloading zones or commercial vehicle zones, also a problem for the same reasons as school zones.
  • No parking or no standing zones, in which it is usually extremely obvious that you cannot park there, are the lowest priority for this app to cover. This would take the form of an instantaneous notification to move your car.
I've been here for about 6 months, being accustomed to the lax parking mentality of Texas and Ohio where every business has its own lot and I've never even dreamed of getting a ticket for anything. After getting several tickets here, I've just about figured out how to remember to move my car (and also my fiancee's car sometimes) in time to avoid it happening again, but I still forget sometimes. Maybe the natives have licked this problem by making mental connections that are not obvious to me, but I reckon a lot of people do forget just like me.

The state of parking apps


Where's the app that will help with this? There does seem to be a plethora of parking apps for Android in general. However, in virtually all cases, all that they do is point you in the direction of paid parking garages and lots, which I suppose is helpful in certain cities, particularly for visitors who don't know where street parking is available or safe.

But New Yorkers tell me that they are parking on the street quite often. They don't have a ticket from a garage, and they need to remember every single day where their car is to move it or not depending on where they parked. It is harder to do it in this city than in any other I've lived in, but to put things in perspective, I have no complaints against the City of New York for their street cleaning program or for the tickets they've given me- I've watched the cycle of loose litter accumulating, with homeowners and property managers and corner store owners steadily sweeping the clutter from their portion into the street and awaiting its disposal. It's just how we do things. If we didn't clean the streets, this city would drown in trash, and if people didn't move their cars, we couldn't clean the streets. My intent is to render the regulations more effective, and hopefully save some people some money who need it.

If you drive to work on the weekdays, then you're probably OK most of the time, since street cleaning is always during ordinary business hours. But I work from home. And even if I didn't, I might come into work at different hours, and I would therefore need to know if my car had to move by 8:30 or 11:30. I can see using this app all the time even with changes to my job situation.


Where is the data?


Locals that I've discussed this idea with agree that it seems great, and cannot name any other app that would do the same thing. In my own searches on the Play Store, I cannot find anything that purports to do the same either.

So the data must be hidden, right? There are over a million traffic signs in New York City and it would be very easy for them to not have these data easily available in digital form. Well, they may not be terrifically easy to use, but they are more available than it may appear at first glance.

Interactive map


NYC Department of Transportation has a handy interactive map that shows, among other things, all of the street signs in the city, which is fantastic if you remember where you parked and have time to consult a web browser.



I applaud them for implementing this feature although its utility is limited for the forgetful. Moreover, although it works quite well on a desktop browser, it can be pretty tedious to manually pinch and zoom to find the exact location on the map on your phone.

Programmatically, I can see some difficulty in using this tool for the following reasons.

  • The selection of a region on the graphical map is obligatory; there is no web API to get JSON or XML of a particular sign's contents. Although I will attempt to peruse the webpage for clues, getting data from it might be a brittle approach if the implementation of this map is ever altered.
  • The street sign contents themselves are just stored as strings and put inside the dialog box. We can parse out the times and dates if the city is extremely systematic in its labeling (MON & THURS, TUE & FRI, 9-10:30AM, etc.). However, we need to be prepared for the possibility of inconsistency (a first stab at this parsing follows after this list).
  • Going from your phone's geolocation data to the nearest sign is itself a tricky problem, requiring us to use the polygon drawing tool with increasing radius to find the nearest matching street sign, and thereafter ensure it is not on a different block or on a crossroads. We have the power to deduce the actual street address by geolocation and Google Maps, but the granularity of this map seems a little less precise.
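
To make the parsing problem concrete, here is a first-stab JavaScript sketch. The sign wording I'm parsing ("NO PARKING 9:30AM-11AM MON & THURS") is an assumption on my part; real signs will need more patterns, and anything that doesn't match should be flagged for a human to read.

  // Hypothetical parser for sign text like "NO PARKING 9:30AM-11AM MON & THURS".
  // The exact wording is assumed; expect to add patterns for real-world signs.
  var DAYS = { SUN: 0, MON: 1, TUE: 2, TUES: 2, WED: 3, THU: 4, THURS: 4, FRI: 5, SAT: 6 };

  function parseSign(text) {
    var t = text.toUpperCase();
    var time = t.match(/(\d{1,2})(?::(\d{2}))?\s*(AM|PM)?\s*(?:-|TO)\s*(\d{1,2})(?::(\d{2}))?\s*(AM|PM)/);
    var days = t.match(/\b(MON|TUES?|WED|THURS?|FRI|SAT|SUN)\b/g) || [];

    if (!time || days.length === 0) return null;  // can't build a schedule from this sign

    function toMinutes(h, m, ampm) {
      h = parseInt(h, 10) % 12;
      if (ampm === 'PM') h += 12;
      return h * 60 + (m ? parseInt(m, 10) : 0);
    }

    return {
      days: days.map(function (d) { return DAYS[d]; }),        // JS weekday numbers
      start: toMinutes(time[1], time[2], time[3] || time[6]),  // minutes after midnight
      end: toMinutes(time[4], time[5], time[6])
    };
  }

  // parseSign("NO PARKING 9:30AM-11AM MON & THURS")
  //   -> { days: [1, 4], start: 570, end: 660 }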

Tabular data



NYC DOT also gives a full list of all the city's parking locations and parking signs in CSV form. These are pretty huge chunks of data. I suppose the user might consent to save a few megs on their device, but I feel the task of finding the adjacent signs based on the location is particularly poorly served by hunting through these CSVs, and so this is not tremendously helpful.

Block-by-Block Lookup


The most useful tool (although still somewhat hackier than an API which I wish they would implement or, if it exists, document) is this webpage listing parking regulations on a block. You look up a block by the combination of these criteria:

  • Borough
  • Street
  • Two cross-streets
To use this, you must convert the geolocation data into a format which allows you to determine the current street, side, and cross streets. Particularly tricky seems to be getting the side of the street, which is north/south or east/west depending on the direction of the road; we must determine that by knowing the prevailing direction of the street and working out whether your position is further north or west (etc.) than the street's centerline. But I feel that this is a sufficiently soluble problem (a rough sketch follows below).
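
Here's a minimal sketch of the side-of-street idea, under the assumption that we already have two points on the block's centerline (say, from geocoding the two cross streets) plus the car's position. Over a single block it's reasonable to treat latitude/longitude as a flat plane and just look at the sign of a 2-D cross product.

  // Which side of the block is the car on?  blockStart/blockEnd are {lat, lon}
  // points on the street centerline (assumed to come from geocoding the cross
  // streets); car is the phone's {lat, lon}.  Note the answer flips if the
  // centerline points are supplied in the opposite order, so pick a convention.
  function sideOfStreet(blockStart, blockEnd, car) {
    var sx = blockEnd.lon - blockStart.lon;   // street direction vector (x = east)
    var sy = blockEnd.lat - blockStart.lat;   // (y = north)
    var cx = car.lon - blockStart.lon;        // vector from block start to the car
    var cy = car.lat - blockStart.lat;

    var cross = sx * cy - sy * cx;            // > 0 means the car is left of the direction of travel

    var runsEastWest = Math.abs(sx) >= Math.abs(sy);
    if (runsEastWest) {
      return cross > 0 ? 'north side' : 'south side';
    }
    return cross > 0 ? 'west side' : 'east side';
  }

  // A street running due east, with the car just to its north:
  // sideOfStreet({lat: 40.7000, lon: -73.950}, {lat: 40.7000, lon: -73.940},
  //              {lat: 40.7001, lon: -73.945})   -> 'north side'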

If you have already determined these (and I feel it is probably much easier to turn geolocation data into this street data than it is to determine distance from signs on the interactive map), then, through some hackiness on the web page, you can extract the parking restriction dates and times in a usable format for your app. I did demonstrate it to my own satisfaction and would happily share my JS with anyone who wants to collaborate on this project, but bear in mind we're at an early stage. I'm going to document my progress in future posts.
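
Once a restriction is in a structured form- say, the { days, start, end } shape from the sign-parsing sketch earlier- turning it into a "move your car" notification time is the easy part. A minimal sketch follows (the 30-minute lead time is an arbitrary choice of mine):

  // Find when the next enforcement window begins and schedule a reminder
  // a little before it.  "rule" is { days, start, end } with start/end in
  // minutes after midnight; the 30-minute default lead time is arbitrary.
  function nextNotification(rule, now, leadMinutes) {
    leadMinutes = leadMinutes || 30;

    for (var offset = 0; offset <= 7; offset++) {          // look up to a week ahead
      var day = new Date(now.getFullYear(), now.getMonth(), now.getDate() + offset);
      if (rule.days.indexOf(day.getDay()) === -1) continue;

      var windowStart = new Date(day.getTime() + rule.start * 60000);
      if (windowStart <= now) continue;                    // today's window has already begun

      return new Date(windowStart.getTime() - leadMinutes * 60000);
    }
    return null;                                           // nothing in the coming week
  }

  // Street cleaning Mon & Thurs, 9:30-11:00am, and it is now Tuesday 8:00pm:
  // nextNotification({ days: [1, 4], start: 570, end: 660 },
  //                  new Date(2016, 11, 20, 20, 0))
  //   -> Thu Dec 22 2016, 9:00am (30 minutes before the window opens)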


Enforcement updates


NYC DOT produces a standard list of holidays on which ASP (Alternate Side Parking) rules are suspended, and gives updates via Twitter or email.

Scheduled suspensions are trivial to program for; dynamically updating the rules for enforcement in the app based on unscheduled suspensions is probably not too difficult. I am certain there is some means of systematically parsing data out of tweets, especially when the DOT seems to be extremely consistent in the text of its messages on suspensions.
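
As a sketch of that idea- and with the caveat that the exact wording of the DOT's tweets is an assumption on my part, so the pattern would need to be checked against the real feed- something like this regular expression approach seems workable:

  // Hypothetical matcher for DOT suspension tweets.  The assumed wording is
  // "Alternate side parking regulations are suspended on <date> for <reason>";
  // verify against the live feed before trusting it.
  function parseSuspensionTweet(text) {
    var m = text.match(
      /alternate side parking (?:regulations? )?(?:are |is )?suspended (?:on |for )?([A-Za-z]+,? [A-Za-z]+\.? \d{1,2})/i
    );
    if (!m) return null;
    return { suspendedOn: m[1] };   // e.g. "Thursday, Feb. 9"
  }

  // parseSuspensionTweet(
  //   "Alternate side parking regulations are suspended on Thursday, Feb. 9 for snow removal.")
  //   -> { suspendedOn: "Thursday, Feb. 9" }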

Saturday, November 12, 2016

First Ladies

I've heard it reported on CNN that Melania Trump will become "just the second foreign-born First Lady in US history."

Once I read that, and realized I didn't know the answer right off the bat, I knew the gauntlet was thrown down. No Google, I'm quarantining myself from outside sources and I'm going to list all of the First Ladies I know and try to guess.
  • Martha Washington, from an aristocratic planting family in Virginia. Probably was born there too.
  • Abigail Adams
  • Thomas Jefferson had no First Lady.
  • Dolly Madison
  • What was Monroe's wife's name... I have not a clue.
The Founding Fathers' wives were, IIRC, all American citizens at the time of the Constitution's signing. I'm excluding 1-5.
  • JQ Adams's wife, I could see possibly being foreign-born. He was very erudite and well-traveled. I know that he was the Minister to Russia, and he might have also been minister to other countries. His son Charles Francis Adams was the US ambassador to the UK during the Civil War, during which he kept the UK's position effectively neutral with extraordinary finesse.
  • Rachel Jackson was almost surely a homespun American woman- Old Hickory only liked them that way. Her untimely demise was a tragedy for the incoming President.
  • Van Buren's wife was probably American. I do not know anything about her though.
  • WH Harrison's wife... no clue.
  • John Tyler, I believe, had more than one wife. Probably from Virginia like him. He was very prolific in making children too, producing his last child in his 70s. There are still living grandchildren of President Tyler, a man who was born in the 18th century, because apparently Tyler family males never stop making babies until they die.
  • James Knox Polk was from TN or KY. Unlikely.
  • Zachary Taylor was, I believe, a Virginian. Unlikely that his wife was foreign.
  • Fillmore? For some reason this seems possible to me.
  • Franklin Pierce, I did remember reading about his family life. Apparently he and his wife were both witness to their son dying in a train accident, which traumatized them both and contributed to his indecisiveness as president. I do not recall his wife being foreign-born, so I am doubtful that she was.
  • James Buchanan, "Old Public Functionary," was famous as a bachelor. Some historians believe he was gay, but I'm uncertain.
  • Mary Todd Lincoln was definitely not foreign-born, she was from an aristocratic slaveholding family, from Kentucky I think.
  • Andrew Johnson from Tennessee probably didn't have a foreign-born wife.
  • Julia Grant was not foreign-born. I'm unsure of her family background, but she and US Grant did put up with a lot when he was struggling between the wars to find steady work.
  • Lucy Hayes (I remember her name because she was a temperance advocate, and called "Lemonade Lucy" because she refused to serve alcohol in the White House) was American-born.
  • James Garfield's wife, I am not sure. Garfield was one of the least wealthy men to become president, was a minister and officer and teacher, and so as an eclectic man of his time may have wed a foreign-born immigrant to the United States.
  • Chester Arthur's wife, I know nothing of her. President Arthur was apparently very productive during working hours but never worked late, so he probably enjoyed spending time with her.
  • Grover Cleveland admitted to fathering an illegitimate child, but was his wife foreign-born? I seem to remember he was married in the White House and she was younger than him and maybe she is the one.
  • Benjamin Harrison, from Indiana, probably not his wife.
  • McKinley of Ohio, not him.
  • TR's wife, why do I know nothing of her? I've read the man's autobiography and he barely talks about anyone but himself.
  • Taft's wife (maybe named Helen) was probably born in Ohio like him.
  • Woodrow Wilson's wife, whose name I believe was Edith, was a very important First Lady because she effectively took control of the US government in the wake of Woodrow's stroke. You go, girl!
  • Warren Harding's wife... probably not.
  • Calvin Coolidge's wife, no way.
  • Herbert Hoover's wife... I do not know either way, but for some reason I'm going to say possibly.
It gets much easier with post-WWII presidencies.
  • Eleanor Roosevelt was American-born, I am fairly sure.
  • Bess Truman was, like Harry, a native of Missouri.
  • Mamie Eisenhower was, I believe, also American-born. I have read some of Ike's autobiography, but I found it pretty dry.
  • Jacqueline Kennedy was American-born.
  • Lady Bird Johnson was definitely a Texan.
  • Pat Nixon was definitely American.
  • Betty Ford, no.
  • Oh crap, I don't know Jimmy Carter's wife's name! Sorry. But I imagine you're from Georgia like him.
  • Nancy Reagan, no.
  • Barbara Bush, no.
  • Hillary Clinton... I think we know a little bit more about her than most First Ladies.
  • Laura Bush, no.
  • Michelle Obama... for some reason I don't know where she is from, but I'm almost certain it's US.
So having listed all of them, I have to narrow down to one guess. My shortlist of possibles is Cleveland, Garfield, Quincy Adams, TR, Hoover.

Beyond that I have no clue, but I tried my best. I'm going to guess John Quincy Adams's wife (not sure on the name), because I know that he had a deep tie to Great Britain, his son ended up as the ambassador to Britain, so maybe she was British.

-------

Checking Google... holy crap, I was right (except I didn't know her name)! The first foreign-born First Lady was Louisa Adams (1776-1852), who wed John Quincy Adams in 1797, when his father was the second President of the United States. John Adams was at first hesitant about having a foreign-born daughter-in-law, but eventually he warmed to her. She was unfortunately in ill health frequently, had many miscarriages, and was depressed during her time in the White House.

Corrections after the fact-
  • Rosalynn Carter. Duh.
  • Rachel Jackson was actually never First Lady, since she died before he was inaugurated. Jackson had two First Ladies in office.
  • Woodrow Wilson did have a First Lady named Edith, who was the one I remembered, but his first wife Ellen died in 1914 while he was in office. He married Edith in 1915.
  • Like Wilson, Cleveland also married in office, and she was a good deal younger than him.
  • James Monroe's wife was Elizabeth.
  • Dolley Madison is the name of James Madison's First Lady. Pardon my spelling; I seem to have been confused by the existence of Dolly Madison baked goods!

Sunday, October 16, 2016

Dave Theurer's nightmares

A rational nightmare


One video game designer, Dave Theurer, consistently had intense nightmares about nuclear war ending the world as we knew it.

I feel that I know his mind somewhat, since I used to have nightmares of nuclear war brought on by the brinkmanship of the Cold War between the US and USSR. At any point in time, both of these ideologically opposed powers had thousands of missiles targeted at the other. This made for an uneasy peace, with lower-intensity wars fought by proxy nations and client states to secure geopolitical dominance. Nukes were effective at keeping the peace because the reality of total world destruction was too much to swallow for anyone in power on either side of the Iron Curtain. Sanity prevailed in the 20th century. You all know the end result- we survived.

But still, the party leaders at the Kremlin operated in an opaque system of suspicion and intrigue, which left Americans on edge continuously. On the other hand, some Americans were quite unnerved by their own elected leaders' rhetoric of "rollback" for exactly the same reason. It did not necessarily pervade every aspect of American life, but it was to be found in a variety of artistic and cultural outlets. I feel that I somehow understood and empathized with this.

The only problem? I was born in 1988. It turns out that I was a generation too late and I had simply been watching The Day After and Red Dawn too many times. My fears were outdated, but my understanding of politics was immature at the time. In fact, at no self-aware point in my life should I have logically feared armageddon through bombs.

Going back one generation, to those who were young men and women in the late 1970s, the fear of nuclear war seems quite rational.

Missile Command was the product of turning nuclear war into a video game. As concepts go, that seems quite bold and incisive- one would expect some kind of social or political commentary. Nope- it was at least two video game generations too early for that kind of advanced game element. The backstory is just the minimum necessary to describe the gameplay, and no more.

Did the kids mind? I bet that over 90% of the quarters shoved into Missile Command coin-ops in the 1980s were spent by kids who couldn't grasp the geopolitics behind the war that was being portrayed in the game. They saw it as a simple challenge to get as far as they could before defeat. The same basic goal as almost every game of the era.

Did the parents mind? Even if they were paying attention to such details, the game players were trying to defend the US from incoming Soviet missiles, so it was not controversial. The Cold War had been simmering for long enough by the late 1970s that American popular culture permitted some trivialization of the horrible outcome.

You can't "win"


I'd like to add some backstory to the early arcade game market.

People did not generally own home video game systems in the 1970s, except for the Pong consoles. All of the big money was in arcades, where games were coin-operated. Profit for the arcade operators came in the form of a delicate balancing act between a game that was high-quality enough to be played continuously by patrons, and yet unforgiving enough to kill them off at a high enough rate of attrition to keep unique players coming, and more quarters rolling in.

No doubt many ingenious people throughout the decades have been tasked with researching the effects of certain gameplay mechanics on the outcome of the game's financial success. When applied to arcade games, the only metric of interest is how rapidly the machine earns quarters.

There were three basic tactics to making a successful arcade game.
  • High-concept gameplay mechanism (more unique players)
  • Gameplay which is progressively harder (more repeat players)
  • No end to the game (more repeat players)
The first tactic is obvious: making the game simple and immediately relatable. The gameplay must engage as many unique users as possible, and avoid offending or annoying many players. This meant a few safe routes could be staked out: sports simulations, whatever was "cool" at the time, or taking the Pac-Man route and making a cute mascot and approachable gameplay.

The second and third tactics are complementary: if the game never ends but is easy, then a player can dominate the machine and decrease the number of quarters you get. If the gameplay is progressively harder but an ending point exists, then players would seek to beat the game, and then move on. Both of these outcomes will limit the number of quarters your machine earns, so you must implement both tactics simultaneously.

These tactics work at odds to some extent- if you make the game too hard right off the bat and kill people off too quickly, you'll find a very substantial portion of the total players won't waste quarters on it. This may be considered an acceptable tradeoff if you want to specialize in the diehards. If you make a game so lackadaisical and childish that serious gamers avoid it entirely, you can dial down the difficulty to make it appealing to a different group, since you probably won't have very dominant players in that group.

Games were generally not "winnable" in the sense that they are today. If you set an arbitrary standard for how long a game lasts or how many points exist in a game, you can win a game of tennis. But tennis without rules has no end. Pong doesn't supply you with a limit, so you will never "win" the game no matter how good you are. The challenge is to keep up the progress continuously and get the highest score before you are defeated by the game. The game will always win, but your score is a measure of how well you resisted the inevitable defeat.

Infinite games had inherent replay value simply for the high score. Any finite quest would be eventually beaten, and from that point, the winner would have nothing left to discover. A high score might also exist, but since the quest was ended, it would be less important.

Since finite games will not generate enough repeat gamers, infinite games (or practically infinite games) dominated this early period. It also produced some heroically gifted gamers who made high scores much higher than the game designers probably intended. However, in the end, even if simply by voluntarily quitting, the best video game player in the world would lose the game and be given a final score.

You would always lose. You cannot win. It was probably not thought of philosophically, but I have sometimes thought it is depressing.

Missile Command- an inimitable product of its time


I cannot get inside the mind of Theurer, but it seems that his fears of nuclear war were certainly not given any sense of closure by the design of Missile Command.

The most hopeful part is that missile defense even exists. The reality is that the totally fake Strategic Defense Initiative would fail to destroy any of the incoming Soviet attack; all the money in the world couldn't have built SDI in the 1980s. Only in the 21st century have we gotten closer to the idea of effective missile defense on even the theater scale, let alone the strategic scale. But in the game, unlike the reality, here you have the power to save a small portion of humanity by shooting down nuclear missiles manually with your wits and guts.

You are the only operator of a local missile defense system for six cities on the California seaboard from north to south: Eureka, San Francisco, San Luis Obispo, Santa Barbara, Los Angeles, and San Diego. Maybe this is all that remains of the world and the rest is already destroyed. Who knows? No time to think- there is very little respite time between the waves of missiles.

You can regain cities by extra points, but the missiles will never stop coming. The best players in the world have gotten millions of points and played for hours on end, but they have never won. You can't win nuclear war.

In contrast to the usual "Game Over" screen that accompanies defeat in a video game, Missile Command showed a giant flashing fireball slowly creeping over the whole screen, which was then overlaid with "The End." No ego-soothing result here: when you lose, it's the end of the world.

Tempest- an endless onslaught of bad guys


Theurer had another nightmare: an endless succession of monsters creeping out of a hole in the ground from an unknown source. You may choose to call it Hell or anything you wish. While this could be visualized as frightening (and it certainly has been by yours truly), it's also another setting which can create hope by a judicious application of gutsy firepower.

Blast away all the monsters as they climb up the walls. Do so even as they scurry back and forth. When available, use your Superzapper to eliminate all onscreen enemies at once. But, as with Missile Command, you are inevitably doomed to die at some point.

Tempest is one of those games with a visual style that's not really reproduced with modern technology. The QuadraScan vector graphics experience can't be compared to anything else. Every pixel is purposeful. It's mesmerizing when you can just barely see the enemy creeping upwards and you are swiveling the twist-top controller like a madman firing at barely visible foes. Then, when you finally clear the screen, and your ship goes flying toward the center of the screen and you're firing manically away at the green spike that is about to destroy you, and all the vector-generated walls grow in size, you realize you're seeing a depth of field which games of the time didn't have, and wouldn't be common until much later. The "depth" of the graphics, if there is such a term which has any meaning, has somehow always impressed me, even though the graphics are now well over 30 years old. I can't explain it. Tempest vector graphics have a crispness and other-worldly quality that still blow me away every time.

The sound effects were also thrilling and daunting. Firing was satisfying, and the fusillade of blasts from your Superzapper was always a delight.

Controls were simply perfect. Your ship had great precision in firing and motion. It felt much tighter and faster than Asteroids. Although it is a good thing that the controls had no lag whatsoever, you were expected to move faster than in many arcade games of the time to make anything like reasonable progress. You must maneuver accurately over a much wider range of motion than Breakout or Kaboom. Your motion is not linear or circular- you are moving along the surface of a geometric shape of increasing complexity as the levels advance, and you must account for the motion of your projectiles along the walls of this geometric shape. The claustrophobic environment which walls you in (a hallmark of "tube shooters") heightens the tension.

A certain frenzied desperation takes over your actions- some people thrive, some people just fail. Some games like Space Invaders just don't give me that feeling at all, others like Missile Command give it to me somewhat. No game does it as fully as Tempest.

My heart pounds and I beg my hands to move more quickly and precisely, but they don't oblige. I watch myself play, get dazzled by the gameplay, fail to find any opportunities for improvement, and wish that I simply had what it took to play better. These symptoms amount to a unique experience for me, which I shall call the "Tempest panic": sincere terror mixed with a bit of delight.

I feel like it's the worthy representation of an intriguing nightmare.

Wednesday, November 19, 2014

Joke of the Day #27

Dan: What do you call really dysfunctional scanner software?
Stan: Not sure. Does it use the most common API in the industry?
Dan: Yes, albeit badly.
Stan: A TWAIN wreck.

Friday, October 10, 2014

Joke of the Day #26

Ann: I'm just looking at this Visual Basic code for the first time. Seems like they declare all variables with the preface "Dim." What does Dim mean? Is it an acronym?
Fran: Does it matter?
Ann: Maybe not, but I thought I'd ask.
Fran: And I thought I'd tell you.
Ann: So if you're going to tell me, then just tell me. What does it stand for?
Fran: Does it matter.
Ann: I don't know-
Together: THIRD BASE!

The wrong war, at the wrong place, at the wrong time, with the wrong enemy

Anti-war politicians need catchy slogans. After all, since the 20th century nations have generally thrown all of their might into a war, using propaganda and misleading evidence to justify launching it.

In case you wondered about the origin of this expression, it was actually a statement by General of the Army Omar Bradley in 1951 referring to an enlargement of the Korean War into Communist China. Full context:

I am under no illusion that our present strategy of using means short of total war to achieve our ends and oppose communism is a guarantee that a world war will not be thrust upon us. But a policy of patience and determination without provoking a world war, while we improve our military power, is one which we believe we must continue to follow…. 
Under present circumstances, we have recommended against enlarging the war from Korea to also include Red China. The course of action often described as a limited war with Red China would increase the risk we are taking by engaging too much of our power in an area that is not the critical strategic prize. 
Red China is not the powerful nation seeking to dominate the world. Frankly, in the opinion of the Joint Chiefs of Staff, this strategy would involve us in the wrong war, at the wrong place, at the wrong time, and with the wrong enemy. 
From testimony before the Senate Committees on Armed Services and Foreign Relations, May 15, 1951.—Military Situation in the Far East, hearings, 82d Congress, 1st session, part 2, p. 732 (1951).

Omar Bradley official portrait.
Original image from Wikimedia- public domain.
Bradley was not generally an anti-war person.... he had, after all, commanded Twelfth Army Group in its invasion of Germany from the West in 1944-1945. General Bradley had commanded over 1.3 million troops, more than any field commander in US history. He was later the first Chairman of the Joint Chiefs of Staff under a unified Department of Defense. Yet restraint and patience and humility came to him very easily, and he was as unimpressed with MacArthur's nuclear braggadocio as he had been with Patton's antics in Europe. When MacArthur was fired by Truman for insubordination, Bradley fully supported the President.

American history books do not generally discuss all the motivations that were present during the Korean conflict. The United States, before WWII the world's most politically isolated major power, had fought and won WWII with such aplomb that it was thrust into a position of uneasy authority. Suddenly the US had to fight a war on a large scale after four years of devastating defense budget cuts, with an almost totally demobilized Army and no desire to spend the blood and treasure a second time to rebuild it. Under these circumstances, and with a grasp of what it would take to win the Cold War in the long term, Bradley made an unqualified stand against expansion of the war, eloquently and with compelling reasons.




  • The wrong war
    • At the time, it was still felt that the Korean War was a feint that was ultimately instigated by Stalin in order to divert the attention of the Western powers so that he could prepare for the invasion of Europe that had been feared since the Berlin Blockade in 1948. The Soviets had largely not demilitarized as the United States had, and the West was pitifully weak without the nuclear advantage. Even so, Allied planners reckoned that if the Red Army mounted an invasion, it would be irresistible. Most plans expected a total defeat in continental Europe, with a long and terrible war to be fought from the British Isles as it had been in the previous war.
  • The wrong place
    • Getting involved in a land war in Asia is usually a hopeless mess. Korea was particularly bad because its winters were bitterly cold and mountainous terrain made tanks virtually useless. However, although fighting for the democratic South Koreans had merit, carrying the war into Chinese territory to win the Cold War, with hindsight, would have been a bloody catastrophe for the United States and China, and it would have set back the region many years.
  • The wrong time
    • The United States was at its lowest level of peacetime readiness since before WWII. During the period 1945-1948, demobilization rapidly brought down the size of the US Army to about 600,000 soldiers, thinly spread throughout the world. The United States was not prepared to casually embark on an invasion of China.
  • The wrong enemy
    • China was clearly a regional power only. They were incapable of harming United States interests except for threats to Formosa and Korea. They chose the easier option. By withdrawing in 1953, and permitting the division of Korea, the United States probably got the best deal it could have gotten, since insisting on full democratization of the peninsula would have been fought by the Chinese. Compared to the USSR, the People's Republic was not interested in exporting revolution, and Bradley was prescient in predicting that tensions with China would eventually cool down.
After the limited Korean War wound down in 1953, and we started to savor the postwar prosperity, the wisdom that Bradley displayed quickly became appreciated. Jack Kennedy completely aped the expression for a 1960 speech in which he suggested that he would not be drawn into war in Vietnam.

But the most famous use has been in referring to the Iraq War. Now-retired USMC General Anthony Zinni used the phrase in referring to the Iraq War's preparation bungles. John Kerry, later in 2004, used the expression and became the first major American presidential candidate to unequivocally denounce a currently existing state of war involving the United States. This was the single-most controversial issue in US politics at the time, but the American people were largely put off by the unqualified pessimism of Kerry, and he lost badly in the 2004 election.

There is some merit to some of the objections to the Vietnam and Iraq Wars. But, without revealing my opinions either way, Bradley's usage was less ambiguous and less controversial than any of the others.

Saturday, September 6, 2014

The computers of my life

I was born in 1988. My firsthand experience with computers of the 20th century was sufficient to convince me that I liked them. After 2005, I began to increase my knowledge of them, but I never dreamed that I would work with software for a living- however, that is what I do today.

This will not be a rigorously researched post- just my own experience and recollections.


A good introduction: Compaq Presario (1993)


A representative Compaq all-in-one that is similar to ours.

My father bought a brand new Compaq Presario all-in-one computer in 1993. The model number was 425 or 433. I have seen enough pictures of the case shape that I am very certain it was one of those models, but between the two there are only minor outside differences. There are no surviving close-up pictures of this PC, although we had it well into the 21st century, so I am not able to confidently state which model it was.

This was not really a cheap PC in the vein of Commodore, but it was not an IBM either. Compaq aimed squarely at the heart of the no-nonsense middle-class American family which wanted a good balance of quality, compatibility, business app potential, and gaming possibilities. For its time, it was reasonably effective at everything it tried to do, and making the system all-in-one (with the monitor mounted inside the case for the computer itself) must have made sense at the time for simplicity.

Fairly sturdy, US-made, with a lengthy instruction manual, with the latest x86 processor and Windows, and a price tag of over $1000, it was not a bad choice for a middle-class US family.

When you experience a thing in your very early years, the most poignant memories are of the basic senses. Since smell and taste don't really enter into computing, sound and touch and sight dominate. For this particular computer, sound was a lovably inimitable aspect.

Push the button to power on, and you were treated with a sequence of miniscule clicks so characteristic that I shall never forget what it sounds like. During the boot process, the hard disk drive made a series of VERY precise clicks and whirs that have likewise entered my permanent memory. There was then a very mellow "beep beep" from the machine- the pitch and timing of those two beeps are also perfectly stored in my memory after all the years. The RAM count finished at 4096 KB, and then it proceeded to launch Windows 3.1.

When the machine was underway, the hard drive was marvelously melodious. Clicks and whirs easily demonstrated when the task at hand required disk access. I assumed, as a child, that it meant the computer had to think extra hard and really gird its loins to solve the request I had just given it. For this reason, I was not overly impatient during periods of slowdown.

To use a contrasting metaphor, a silent computer which is not very fast is infuriating. It is like a listener who exhibits no verbal or nonverbal cues of understanding as you speak. Maybe the listener has understood, but you have no indication of it. If the person is slow, or fails to complete your request, it is particularly exasperating, as though the machine is not listening.

A noisy computer is much more relatable, even if slow. It is like a slow person who gives you lots of cues that he has understood what you are saying, and seems to show real effort in thinking of a response. It seems silly to say all this, but the mere fact that the computer made clicks and whirs at the right times elicited a great deal of mechanical sympathy from me that I did not give to any other machine at this point in my life. However, I can overcome the desire for noisy feedback if the machine is fast enough (as almost all modern machines are), just as in the real world I would forgive someone for not providing filler like "yeah" and "mmhmm" while listening if they got to the matter right away and demonstrated understanding.

The tactile sensations were also superior. I was greatly fond of the two-button mouse and solid keyboard. The only standard for comparison I had in the same period was the Apple II that my school used in great quantities. I did not like Apple's operating system then (nor do I today), and I particularly hated the one-button mouse and chintzy keyboard. That old Compaq keyboard and mouse served as an exemplar of what these peripherals ought to feel like.

As far as the sensation of sight.... it was no different from any other machine of its generation. All Windows machines looked the same. At the time, 32-bit apps mingled alongside 16-bit apps. Most games I played were highly rudimentary, for which 256 colors would have been overkill. Windows 3.1 was a fine operating system for a child to learn on, and I still have fond memories of it for that, although I think I would struggle to use it practically in the real world today.

But the sound and touch of the machine made me feel as though it had a presence of life which was unlike any other computer I had ever used. It was a machine which I loved.


The coming of sound: Dell running Windows 95


In about 1994, the situation changed a little bit in my family. There was a divorce and job shuffling. The Compaq remained in the household in which I lived, and it remained perfectly serviceable. In fact, my brother and I grew so attached to it that it was rarely available for the practical needs of working and studying adults. This entered into the decision to purchase a second home computer for my family.

My mother decided to buy a new machine in 1995 or 1996. The fuzzy details of this period of my life preclude remembering any more detail than that. I do not know what model of Dell it was, but I do know that it ran Windows 95.

I was extremely curious about her new computer because it had speakers. We had no sound on the old Compaq except the disk drive's charming little symphony of clicks and whirs. Therefore, to my mind there was no startup sound- in fact, I did not know until years later that a "ta-da" sound was supposed to be playing on Windows 3.1 startup! It was never played to my ears because the Compaq had no speakers, only the internal beeper for system beeps and blips.

Imagine my awe and delight when she turned on her PC for the first time in my presence and it played this beautiful melody.



I was spellbound by the fact that it could play music, for all my experience with music up to this point was in the form of the radio or cassette tapes, and they seemed so limited. If a computer could generate music, it would be capable of imitating musical instruments.

However, this was all I ever heard. My mom mostly used the PC for work purposes, and I was never permitted to use it. She did play games occasionally, but she was a fan of the text-based RPG style of gameplay, and there was no potential for audio.

Still, I heard that damn Windows 95 startup sound many, many times.

In later years, I would revisit Windows 95. It is, by the standards of its day, one of the most groundbreaking operating systems ever made. So many features were newly introduced. It eliminated the Program Manager paradigm of Windows 3.x. Though it was still built on DOS (and 98 and Me would be, too), Windows 95 was a GUI that was entirely different, feeling far closer to today's operating systems than to all operating systems that preceded it.

Stability was not a hallmark of these machines, but I am told that earlier versions of Windows were not either. If you shut down improperly, or fail to restart after applying changes, you can ruin the current configuration of your machine rather easily. Crashiness is an issue, and it remained so in Windows 98. Windows Me, of course, is so terrible that it doesn't bear talking about.

Elementary School Giveaway: IBM PC AT


IBM PC AT (courtesy of Vintage Computer)

My brother, somewhat older than me, was always substantially more adept with computers until we were very close to adulthood. I tried to follow what he did, but sometimes I just wasn't interested.

This was one of those cases.

In 1997 or 1998, someone at his school (he was two years older than me, so he went to a different school) had tipped him off that the school would be disposing of relatively obsolescent PCs which the school intended to replace. Grandma dropped him off; I did not go with him to pick out a machine. Free computers, and I didn't even go! Was I insane? What on Earth was I doing that evening?

He selected an IBM machine whose model we still cannot agree on. We agree on only two things:
  • It ran Windows 3.1.
  • It had those vents on the front that characterized most (but not all) 80s PCs from IBM
I think it is very unlikely that it was as recent as a PS/2, since that didn't have the front vents, and these were generally not yet old enough to be so obsolete that an elementary school would give them away.

However, could an earlier IBM machine even run Windows 3.1? There is no way that a 5150 or an XT is capable of running the most advanced 16-bit operating system ever made by Microsoft. The XT usually shipped with PC-DOS, although it is certainly capable of running Windows 1.0. But 3.1? No way.

However, the AT, launched in 1984, meets the minimal requirements for running Windows 3.1, as long as it has enough RAM. They definitely shipped ATs with as little as 256K RAM, but it was expandable to up to 16 MB, which is far more than the minimum 1 MB needed to run 3.1. It did, however, have an 80286 processor, which was at least two x86 generations behind the current processor line when Windows 3.1 was launched. In other words, although it would work, it would be slower than the pitch drop experiment.

This accords well with my memory of the machine. It was notoriously laggy and unable to respond quickly to any request. This machine yearned to be saddled with something less bulky. But we got sick of that tired old IBM's slow ways, and tossed it out ourselves after some amount of time.

Apartment Dumpster Fodder: Tandy 1000

Tandy 1000 (old-computers.com)

In the same vein, my brother again got an interesting find of opportunity. It was a computer that one of our neighbors intended to throw away. Not wanting to see it thrown out, he brought it back home.

We are in agreement about which model this was. It was a Tandy 1000 for sure. Not the older TRS-80: the Tandy 1000 was a popular competitor in the IBM PC compatible segment, and Tandy was still doing well by the end of the 80s.

This machine probably had an even slower processor than the IBM, as most Tandy 1000s shipped with an 8088, like the original IBM PC and the XT. The 1000 SX stepped up to an 8086, and while the 1000 TX was equipped with a 286 processor, it was still in a class below the IBM PC AT in other fields of performance. Tandy computers were sold through the then-vibrant Radio Shack retail network. Good prices and decent PC compatibility were strong selling points. Performance was adequate for the day, but nothing spectacular. They were a solid choice for families or educational institutions.

However, I have no memory of slowness with this machine, because it had no graphical operating system. It almost certainly was running MS-DOS, but I cannot comment on which version. It did not even have a 3.5" diskette drive, instead relying on 5.25" floppies.

This was a machine we got for nothing, had little to no support for, and chucked out when we ran out of ideas for it. It took maybe two weeks.

"Floppy Disk" Nomenclature


I will take a brief moment to discuss what I think a "floppy disk" is.

A floppy disk is the very large (and today rare) 8" disk or the very common 5.25" disk that was used during the late 1970s and through the 1980s. These disks were called floppy because they actually were: a disk of magnetic-coated film housed in a thin, bendable plastic jacket. They lacked structural rigidity, but the upside was that they were very thin and could be stored densely. The 5.25" version stored anywhere from about 200 kB to over 1 MB. Even if they were branded as "diskettes" at the time, calling them floppy disks was popular because it was a useful descriptive name.

The 3.5" disk replacement came to dominate soon thereafter. In addition to the useful improvement in storage space to 1.44 MB (or 2 MB in later versions), diskettes had hard plastic shells that made them far more durable than the old floppy disks.

Around 1990, it became useful to differentiate between the two, so the term "floppy disk" was used to describe the older disks, and the branded term "diskette" was used to refer to the newer hard-shelled disks. The distinction remained relevant until the mid-1990s, when the old 5.25" disks were totally obsolete. This is the environment in which I grew up. I remember the distinctions in the terminology very well. I never once called these 3.5" disks "floppy disks" because they simply were not floppy. I do not remember them being advertised as such.

It was only after the 5.25" drives totally disappeared from the market that the meaning became muddled. There came a point in the late 1990s when I heard people referring to them as floppy disks, and I came to grips with how the shifting sands of the English language had produced ambiguous terminology simply because of technological obsolescence.

Still, to this day, I do not believe the ubiquitous 3.5" disks should be called "floppy disks," and I feel strongly enough to write about it.

Summary:

  • True floppy: 8" disk introduced in early 1970s. 
    • Call it a floppy disk to be clear. 
    • Use the size in inches to be more clear.
    • Use the size in kilobytes to be unambiguous.
  • True floppy: 5.25" disk introduced in the late 1970s, and popular until the late 1980s.
    • Call it a floppy disk to be clear.
    • Use the size in inches to be more clear.
    • Use the size in kilobytes to be unambiguous.
  • Not a true floppy: 3.5" disk, introduced in  1987, and popular even into the early 21st century.
    • Call it a diskette to be clear.
    • Use the size in inches to be more clear (although 3.5" is the overwhelming standard)
    •  Use the size in kilobytes or megabytes to be unambiguous.


Lemon: eMachines from 2000 (running Windows 2000)


My dad bought this computer while my brother and I were out of town over the Christmas break with mom. So we came back home to be greeted by a new computer- the first brand-new one in the house since 1993. I was thrilled from the very beginning.

We got off to a good start with the operating system. Windows 2000 was no longer based on DOS, and was moved to use the NT kernel that higher-end Windows systems had been using since 1993. As a result of those changes and constant development, Windows 2000 attained very high stability relative to its predecessors. On the surface it looked extremely similar, with few interface changes from earlier Windows 95 and 98, but it had some good changes underneath. Windows 2000 is still well-regarded by some minimalist computer enthusiasts to this day. Comparing it to the critically panned Windows Me, released around the same time, is like comparing night and day.

However, the hardware side would let it down severely. If you were born before the mid-90s, you may have some memory of the reputation of eMachines. If you were born in the 21st century, you may never have realized it, but the early eMachines computers were the Yugo or Edsel of their day.

In 1998, the US market was dominated by American manufacturers. At this time, the American computer companies had yet to do much outsourcing: Dell, Compaq, HP, IBM, and even Packard Bell still (mostly) made their machines in the USA. Quality varied from excellent in the IBM PCs down to mediocre in the Packard Bells. Some outsourcing existed, but there were few foreign brands.

Surprisingly weak in the US PC market was Japan. The imported PCs from Sony's Vaio range were extremely costly and very high-quality, which has ensured them a very small but loyal fan base throughout the years.  The Japanese were technologically very mature by the mid-80s when PCs started to be made by American manufacturers en masse. Japan did not have low enough wages to undercut the American manufacturers on price as they had previously been able to do with radios, motorcycles, cars, and televisions. By the 1990s, Japan had, perhaps, evolved to such a high level of development that it couldn't profitably crack into the US PC market. There was a lot of room at the bottom, and not much room at the top.

Therefore, it was time for a new Asian challenger in the huge American PC market- the time for South Korea to sell PCs abroad was nearing.

PCs were already a proven market by the mid-to-late 90s, but the capabilities they offered varied hugely. Although American families had become familiar with PCs throughout the 1980s, they had largely bought Commodores and Apples, and the price of entry remained relatively high throughout the 1990s unless you were willing to get a very basic machine. The cheapest x86-equipped machines of 1998 started at about $999 with monitor. For that price, you got a Celeron, inadequate RAM, and probably a CD-ROM drive, but maybe not a CD burner.

The eMachines PCs started at way less than that. You could get them for as little as $399 without a monitor. The earliest eMachines models were about as cheap as you could possibly make a functional general-purpose computer with an x86 processor and Windows- these were basically the only two requirements unless a machine was to be considered a niche item. More expensive versions, still cheap at under $800, got you much more equipment and power than competing brands. It was an appealing proposition.

The successor to the 486 was the Pentium, which by 2000 had gone through several updates into the Pentium III. Starting in 1998, the Celeron was released as a cut-rate alternative to the Pentium range. From 1998 to today, Intel has always had a Celeron equivalent for budget systems. This helped bring the cost of PCs down, but these chips were notoriously slow, with woefully undersized caches and reduced clock rates.

Still, this was not the Achilles' heel of the machine we bought- not by a long shot. That would be the hard disk drive. For sure.

At first the machine was silent- that exact kind of slow, uninterested silence that I detest about computers. No progress was audibly demonstrated. However, occasionally, it would make noises. These were frightening, sharp clackety noises emanating from the hard disk. Not the charming, hopeful noises of the Compaq. These were worrying noises.

This was a machine that was born with asthma, angina, and arthritis.

Sometimes, during a disk read, you could hear it get stuck in some kind of rut, and the drive would spin endlessly, making a succession of clicks like a metronome. The system was totally unresponsive at this point, and all work was lost. We undoubtedly hastened the demise of that cursed machine by angrily punching the case when this happened.

Then we suffered a head crash. The drive was wiped out, and none of us had the computer knowledge or competence to diagnose the problem. It was a dead machine.

It had only barely functioned for its entire life. The total lifespan before its head crash was no more than eighteen months. Long enough to outlast any warranty of the 1990s, but short enough that anyone would be understandably disappointed in their purchase.

This was most definitely not an isolated incident. eMachines had almost uniformly poor customer reviews during this time period. However, those reviews only helped consumers who had access to the internet or to magazines that carried them.

They quickly earned a richly deserved reputation for crappy quality, wrenching the lowest-quality-PC accolade from Packard Bell, itself in serious trouble in those days. Neither brand is still around today, but once eMachines hit the market, Packard Bell was no longer the lowest-quality machine around, and that would remain true until the end of both companies.

eMachines was not the first attempt to carve out a niche at the bottom. Packard Bell had heady success during the mid-90s and, for a time, led even Compaq in US consumer sales. Compaq fought back, and undertook to move its own products downmarket in order to strangle Packard Bell. In addition, Compaq successfully sued Packard Bell for failing to disclose that its PCs contained used parts, even though this was standard practice across the industry, including at Compaq. Low prices brought Packard Bell up, while perceptions of poor quality spelled its end. eMachines launched into a market without a clear choice in the low-cost segment, and it undercut all its competitors on price, earning it immediate success.

The casual computer buyer of 1998 faced constraints that strongly favored cheap new upstarts like eMachines, despite their problems:
  • All the personal computers were relatively pricey, and the gap between the pricey brands and the cheap brands was very large.
  • In a relatively new market, the lifespan of a product was not as well known. People might tolerate a machine only lasting 2 years before dying without dismissing the brand in the future.
  • There were no tablets, and no smartphones worth talking about. If you wanted a computer, you had to buy a desktop. Laptops were substantially more expensive, and were bought almost entirely by the well-off.
  • If the computer was slow, that was not such a big deal. A slow computer is better than no computer.
The casual computer buyer of 2013 was likely to be in an entirely different situation:
  • Most desktops were as cheap as possible. The cheapest were around $300, and the margin between any two brands at the bottom end was less than $50.  Although eMachines also made laptops, the cheapest laptops were well under $400. This left less room at the bottom of the market for hardcore cost cutting.
  • Desktop PCs were a highly mature market. Quality and reliability were required. If a machine lasted only 2 years, it would be considered a bad product, and hurt the brand perception. Most machines of today have at least a 1 year warranty, and retailers will often offer at least 2 extra years of warranty protection. The machine had to be relatively well equipped for its price, especially compared to earlier PCs.
  • Tablets had gone from rich people's toys to a mature product in just a few years. Although they were less capable than desktops for many purposes, even high end tablets of 2013 were cheaper or around the same price as low-end desktops, leading some people to not buy PCs.
  • Smartphones often come free with a phone plan and contract, so there was usually at least one personal alternate source of internet access and other applications, even for young buyers, leading yet more people to not buy PCs.
What was once the heart of the American market in the 1990s was now merely one piece of the pie alongside other strong competitors. Although the absolute size of the desktop market has only shrunk slightly, it is an extremely unprofitable area of business.

This environment hurt all companies like eMachines which focused on conventional PCs unless they had a backup source of income. IBM left the market entirely, but they remain profitable in other fields. Lenovo bought IBM's former PC business, and they are today the world's biggest desktop PC manufacturer. Dell and HP still make servers and workstations for business customers, so they have some insulation from PC retail sales. While Packard Bell was a casualty of its own bad reputation, coupled with some arguably unfair bad press, this was prior to the decline of the desktop PC. By contrast, the demise of eMachines was a direct result of the decline of the desktop PC.

In short, the desktop market was too cutthroat for a low-end PC specialist with a poor reputation for quality to survive, and since eMachines never entered the business of selling tablets and other mobile devices, it lasted about as long as could be expected. Still, it stuck around for a number of years in the meantime.


Not quite a lemon: eMachines, XP Home, 2002


So, following the ignominious downfall of the old eMachines, what did my dad replace it with, some months later? Another eMachines.

XP was coming out, and what an exciting time it was. As far as color goes, this was the biggest advance in operating system history. Previous Windows releases had had gray menu bars and a gray taskbar, with only the odd bit of dark blue to offset the gray.

XP had a blue taskbar and a green start button! Even the other color option for the taskbar, silver, was much more vibrant than the previous generation's gray. Every window had a red "X" in the top bar to close the window. Window opening and closing was oftentimes accompanied by animations. It was downright pretty.

However, this graphical enhancement took a big toll on the speed of the machine. Although we had a much more capable system this time, XP had higher requirements than previous generations, and it was undoubtedly slow on that eMachines.

I had said that eMachines never quite outlived their reputation for poor quality, but throughout the 2000s they definitely took steps to address it. Very happily for us, eMachines had begun to step up their reliability game. This computer was simply better-built and had better parts than its predecessor. We had this machine for about 4 years before its demise, which is a decent, if unexceptional, lifespan. By 2005, some technology critics believed that eMachines had roughly achieved parity with Dell, the benchmark for its segment. Not everyone believed the critics, but people kept buying eMachines anyway because of the price.

I grew to like this machine more than the last, for sure. But I never loved it. It just did not have a quality feel. The keyboard was extremely lousy, with no feedback and mushy keys. It had some crashiness issues, like previous generations, but this was a family machine shared with my brother, sister, and Dad, so I cannot say whether risky downloads and insecure behavior contributed.

For me, the coming of home internet was the main appeal of this generation of computing. We first became internet customers when we signed up for CompuServe in 2001. We would retain this service for the next six years, well after most people had ditched dial-up.

The dial-up experience was somewhat magical, because you were hearing numbers dialed, random sounds being played, and you had no idea what was going on. The fact that the dial-up start sequence was audibly played meant you knew how long you had to wait. It became a welcoming sound. When you heard it, you knew you were entering the broader world of information. To use language that was already laughably outdated in 2001, you were merging onto the information superhighway.

Internet access was, of course, very slow. Additionally, any other person in the house could knock you offline by picking up the phone receiver. Since these were issues that everyone experienced in the days before broadband, we simply coped. Text-based webpages were a necessity. I loathed being greeted by a webpage redesign that made a site slower to load, but I didn't blame my ISP for the slowness. I just wanted the internet to stay simple.

It is difficult to recall distinctly what my browsing material was as a young teenager. (Hold your snickering, porn was not involved) I remember joining Facebook in 2005 well before most people knew about it (and in advance of my high school friends, who all claimed that they didn't want to join). Prior to that, I just have the fuzziest memories of my early experiences with the internet.

  • CompuServe 2000
    • "Welcome to CompuServe." (inevitably, we called it CrapuServe)
    • "You have mail." (if I was lucky)
  • A webpage with a photograph of Al Gore and George Bush, candidates in the 2000 presidential election. You could stretch and smear their faces using the mouse. For some reason I found it riotously funny.
  • Pre-Google YouTube
  • Jennifer Government: NationStates
  • AOL Instant Messenger (although we had CompuServe IM too, AIM was more popular for a long time afterward)
  • Yahoo Messenger (which still exists)
  • The Grape lady (this was actually in 2006, so I misremembered how early it was)
  • MySpace... enough said.
  • "Thank you for using CompuServe."
The internet for me in the period 2001-2004 was shared with two siblings. Or rather, we did not really share it- we took turns and demanded our privacy when browsing. This made for relatively little internet time, and if either sibling was belligerent, you could be sure that you'd accidentally get knocked offline a few times. Consequently, the internet wasn't great in this period.

When my brother went to college in 2004, it got better for all of us. He suddenly had excellent broadband on campus. We suddenly had more internet time. But there was still only one computer (the 1993 Compaq was still in our possession, but just gathering dust at this point).


Rolling my own (2005 to present)


In 2005 I decided I wanted to build my own machine, with my brother's help. He mostly figured it out himself, but I watched him every step of the way, and that would be the only time I watched someone else do the building.

I discovered a site called newegg, which I still love to this day. It was the source of every single part of that first PC. And the next. And the current. Frankly, I only rarely shop around because I am so satisfied with newegg that I can't imagine a competitor beating the whole experience. I have nearly 10 years of flawless experience with ordering, delivery, and longevity of electronics and computer parts I have bought from newegg.

The first build is always special. It was June 2005.

I selected an Athlon 64: socket 939, Venice core, 90 nm feature size, 64+64 kB L1 cache, 512 kB L2 cache. That was a very generous L2 cache for the day! The clock speed of this processor was only 2.0 GHz, which left it far behind the Pentium 4 of the day. But the AMD was a much more efficient processor: it made better use of its cycles, ran cooler, and was more reliable. Performance was surprisingly good by most metrics, and inevitably it was far cheaper. I would say that during this period Intel had lost their way, with the "megahertz myth" driving their design philosophy. This was a fresh new microprocessor design in 2005, and I was extremely pleased with the performance.

The second build used the same case. This was in 2008 or so. I used my old processor and many of the old parts to build my dad a computer that was suitable for his needs, buying some new parts for my machine in the process. I bought another Athlon, this time a dual-core 64 X2: socket AM2, Windsor core, and I do not recall the clock speed. This was a 2006 design from the first generation of AMD multi-core processors, and it did not feel like a huge leap even at the time. I stuck with Windows XP.

This build would last me until 2011, by which time it felt fairly archaic. The cheapola Chinese case lit up like a chintzy aircraft carrier with six blinding blue LEDs, which I considered cool when I was a junior in high school. Everything held together OK, but it didn't feel great. I guess I should feel lucky that the power supply which was thrown in for free with a $45 case lasted for 6 years.

In 2011, I did the third build overall. However, this was the second all-new build. This is the basic system I currently use: I went all the way to a Sandy Bridge Core i5 from Intel. This was the period in which I felt that Intel had reasserted its dominance. The power available in this CPU is incredible. An i7 is just superfluous when the i5 can do so much. More expensive than an AMD, but it was so worth it.

I did not buy a super-cheap case this time. I bought an Antec. Mind you, it was only a Sonata, but the quality of this thing runs rings around my old case. I occasionally wipe down the exterior, and I make it a point to clean out the inside of the machine with an air compressor several times a year. Doing this means that every time I open the case, everything feels fresh and new inside. It's a great case.

I did something that might make some geeks think of me as a sucker: I bought Windows 7 with money. I don't feel guilty about it, since this is the finest thing (for my money) that Microsoft has ever put out. This is my current favorite graphical operating system, and although it is not perfect, it is reaching a very high state of polish which I think will be hard to beat for desktop-only machines. Since Windows 8 and 8.1 were designed with touchscreens in mind, they sacrifice some of the desktop purity, although I have no doubt they are also good systems for what they are intended for. I'm a big fan of desktops, so Windows 7 feels like a high-water mark. Since most businesses with computers for professional use migrated from XP to Windows 7, I think they would tend to agree. Windows 7 is the most widely-used general-purpose computer operating system in the world as of 2014.

Somewhere in the "real world" (or maybe just in my corner of it), companies expect software developers to sit down and code on a desktop computer with a separate keyboard. Not a lot of development gets done on tablets. Although I don't doubt that plenty of it does get done on laptops, laptops don't necessarily have touchscreens and I feel it is inappropriate to treat them like big tablets. Maybe I'm echoing outdated concerns here, but this is what I see from my experience. Point is- if Microsoft doesn't plan to take care of its enterprise customers when they seek a robust desktop OS to replace 7, then Windows might no longer be an "automatic choice" for those businesses. This would spell Microsoft's doom unless they suddenly start to dominate in some other field, which they currently don't.

In 2012, I felt that the onboard sound was insufficient, and I bought an Asus 5.1 sound card and installed it into one of my many open PCIe slots. This permitted me to get a better speaker system. I use my desktop as a surrogate stereo, so it was a worthwhile purchase.

By April 2014, more updates were called for. I had gotten a long run out of my original 1.5 TB Seagate disk drive- a drive which, I later read, came from a very poor batch that suffered a huge number of failures in real-world use. I decided to augment it with an additional 1 TB HDD. Big deal- HDDs never get appreciably faster, so it was just more storage space.

At around the same time as this, I bought a second monitor. This is a very nice Acer with a quite substantial stand. Having a second monitor is simply wonderful for many everyday tasks, since it means you can still use your PC for other things when an application demands full use of one screen.

Its sturdy, telescoping stand is perfect for portrait-mode display, which I find useful for several reasons.

  • When I am on Linux, there is more command line history and space for logfile display available.
  • When I am in Windows, there is more space for a longer webmail page that shows my whole inbox at any given point in time.
  • For certain spreadsheets, I have very long lists with many rows, but relatively small columns. Portrait mode gives me more onscreen cells of value.

Also around the same time, I sprang for the biggest advancement I have ever personally witnessed in computing technology: the solid-state drive. This is the most satisfying technology purchase I have ever made.

If you are using a computer which is not equipped with an SSD, I think it behooves you to consider no other computer-related purchase before you make this upgrade- even a tiny one that is just big enough for the OS and system applications. The difference in speed is mind-blowing. It is far more of a difference than any CPU or RAM change I have experienced. Boot-up time is 6-10 times faster on my machine, while everything that depends on my OS happens practically instantaneously. There is no longer any noticeable delay in opening, moving, and copying files. If you get an SSD, regardless of the processor you have and the RAM and the video card setup, I suspect the only true bottleneck will be your old-fashioned American internet, unless you're a Google Fiber customer, which I am not.

You might think I'm exaggerating and I'm using too many superlatives. I'm not! In fact, I could really talk about how awesome SSDs are until I am blue in the face. I could write a post just talking about how disk access times are the bane of my existence, and it is impossible for a machine to feel fast without some technology that minimizes them. It does not matter WHAT you are doing, you will see huge advantages from an SSD. Buy one today! Buy one from Samsung, buy one from Intel. Doesn't matter that much, they're all marvelous.

If I were ranking the satisfaction of upgrading to new computer technologies throughout history, the top would not be LCD monitors. Not CD drives. Not graphical operating systems. Not even high-definition video! There is no other advancement quite as thoroughly satisfying in all my 21 years using computers as the SSD.

Thanks to these updates, my current system performs extremely well. The next purchase on the horizon is possibly a more advanced graphics card; I am still using the CPU's onboard graphics with my dual-monitor setup. I suspect that this is part of the reason why my display driver crashes relatively frequently. It is a minor irritation, but an irritation nevertheless.

The future for me does not hold any more pre-built desktop PCs. I will never buy another one. It is not that I lack fond memories of some of the great machines I have used in the past, but as an adult in software engineering, I feel that the priorities of a PC vendor are different from my needs as a PC buyer. For my own peace of mind, and to ensure that all the parts within are of decent quality, I will purchase and assemble all my home PCs personally. If I must buy another laptop (I have a low-end laptop from 2011), I will not have this luxury. If I must buy a tablet (I currently have none), I will not have this luxury. However, with desktop PCs, you have fewer limits, and can build a fully customized machine that does exactly what you require of it. I find that outcome very satisfying. Perhaps you could give it a try?

A limited-time bonus: Any commenters on this post will get free advice in computer assembly, if they seek it!

The very unique President Garfield


Garfield's Life


Brady-Handy portrait of Garfield, 1870s

James A. Garfield was a man who became President and died shortly thereafter. He does not occupy much space in our collective memory. It is hard for authors to lionize him excessively, because one cannot make a career (or a legacy) out of what one was likely to do, or what one intended to do. However, if these are valuable indicators, then they suggest Garfield would have been a great President.

He was a renaissance man of his time, an extremely skilled scholar of Latin and Greek. He was the only President who was a clergyman. Garfield served as an Elder in the Church of Christ, resigning his position only on the occasion of being elected President. Before this, Garfield had been a languages teacher at the Western Reserve Eclectic Institute in Hiram, Ohio.

In the lead-up to the Civil War, Garfield was a passionate anti-slavery orator, and soon allied himself with the fledgling Republican Party. He immediately sought a commission in the Union Army at the outbreak of war. While serving as General Rosecrans's Chief of Staff in the Western Theater, Garfield passed the time with Rosecrans discussing religion and philosophy. The two officers got on well personally, with Garfield showing unswerving loyalty to Rosecrans, and the latter referring to Garfield as "the first well-read person in the Army." During the war, Garfield's writings demonstrated that he grasped that the real aim of the war was to eradicate slavery, doubting that the cause could prosper while slavery was maintained: "It is hardly possible God will let us succeed while such enormities are practiced."

Elected to Congress in 1862 (against his wishes to remain with the Army), Garfield had a reputation as one of the most captivating speakers on the floor. When he spoke, all listened.

General Garfield was on hand in Washington during Lincoln's assassination in 1865, although he was not present at the event. Two men who had praised Lincoln's assassination were beaten nearly to death in the streets, and there were calls to form a mob to destroy the offices of newspapers which had been unkind to Lincoln. In a gesture that today seems Hollywood-esque, Garfield literally grabbed a US flag, stepped forward, and calmed the masses with an extemporaneous speech.

"Fellow-citizens,—Clouds and darkness are round about Him. His pavilion is dark waters and thick clouds of the skies. Justice and judgment are the establishment of his throne. Mercy and truth shall go before his face. God reigns, and the government at Washington still lives.

He echoed Biblical grandeur in admonishing the panic and demagoguery, asking that the reality of Lincoln's death be accepted somberly. It was successful. The crowd was hushed, and no mob was formed.

In his personal life, Garfield was a voracious reader, with a personal library of some 3000 books. He was also technically gifted, and is certainly the only US President to publish a mathematical proof: he submitted a fairly clever trapezoidal proof of the Pythagorean theorem to the New England Journal of Education in 1876.
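
The trapezoid argument is simple enough to sketch here; this is my own paraphrase of the usual presentation, not Garfield's wording. Take two copies of a right triangle with legs a and b and hypotenuse c, and lay them so that the leg of length a from one lines up end-to-end with the leg of length b from the other. The two hypotenuses can then be joined by a third, isosceles right triangle with legs c, and together the three triangles form a trapezoid whose parallel sides are a and b and whose height is a + b. Compute the area both ways:

  • As a trapezoid: area = ½(a + b)(a + b)
  • As three triangles: area = ½ab + ½ab + ½c²

Setting the two expressions equal and doubling both sides gives (a + b)² = 2ab + c², which expands to a² + 2ab + b² = 2ab + c²; the 2ab terms cancel, leaving a² + b² = c².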

Garfield showed intense interest in the civil rights of African-Americans in the wake of the Ku Klux Klan's rise during the 1870s. Although he felt torn over supporting the Ku Klux Klan Act, as it would give the President extremely broad powers (including the power to suspend habeas corpus), he used his inaugural speech to declare that blacks deserved "the full rights of citizenship" and added: "Freedom can never yield its fullness of blessings so long as the law or its administration places the smallest obstacle in the pathway of any virtuous citizen." With strong foresight, he predicted that if blacks were not educated, and their right to vote and exercise their other rights not freely secured, they would become "America's peasantry." Garfield's predecessor, Rutherford Hayes, had willingly accepted the Presidency in a bargain whereby the Southern states would agree to the Republican victory only if Reconstruction were dismantled. Almost immediately after 1876, blacks were evicted from every level of state government across the former Confederacy, and they were forced to yield almost every practical right they had previously enjoyed. If not slaves, the freedmen were treated on the same level as serfs. Garfield's personal principles suggest that he intended to roll back the status quo that was being entrenched throughout the South, but he never even had a shot at doing it. It would be many decades before another President emerged with similar principles.


Assassination


Unfortunately, Garfield had only a few short months of Presidential power before he was shot by Charles Guiteau. He would be helpless and bedridden for the rest of his life.

His wound was a single shot to the abdomen that lodged near his pancreas; another bullet grazed his arm but caused no serious damage. The President's medical treatment was sloppy even for the time. American surgeons were disdainful of the notion of surgical cleanliness then gaining ground in Europe; they inserted unsterilized fingers into his abdomen and hastened his demise through infection. They also incorrectly guessed where the bullet had gone, and probed continuously and unsuccessfully for the slug. If the bullet had been removed with sterile tools, or even if it had simply remained in Garfield's body without the poking and prodding, he likely would have escaped the massive infection that contributed to his death. Eventually, the doctors came to believe that his intestines had been damaged by the bullet's course, and prescribed a very strict diet of egg yolks, bouillon, and whisky- even going so far as to require feeding rectally instead of orally. Thanks to this advice, the President wasted away to practically nothing, losing nearly 100 lb before his death.

By modern standards, Garfield's wound would be considered miraculously lucky. The autopsy revealed that the bullet did not damage any vital organs. He could have been treated with high confidence by surgeons of today.

As summer wore on, Garfield found no comfort in the oppressive heat and humidity of DC. His doctors suggested that he be moved to a more comfortable and agreeable climate to speed his recovery, so he was loaded onto a train for the Jersey Shore. The main line went as far as Elberon station in Long Branch, but sympathetic residents had (in a matter of hours) built a new railroad spur that led directly to a seaside cottage. This spared the dying President a bone-shaking carriage ride. Later on, the temporary spur would be dismantled and the ties used to build a small hut known as the "Garfield Tea House." This humble wooden monument to President Garfield's memory still stands today, although in rough condition.

In great pain and close to death one day in September, Garfield sat up in bed to scribble his signature, adding the Latin phrase strangulatus pro Republica - "tortured for the Republic."

James Garfield died of his injuries on September 19, 1881, his end hastened by medical maltreatment.


Civil Service Reform


Public outcry following Garfield's shooting and death was directed at the very system that Garfield himself had vigorously opposed: the "spoils system," whereby a newly elected leader or party would expect to replace most officeholders with their own loyal supporters. These government jobs were usually not filled competitively, so they were comparatively easy jobs, and many of them (Collector of the Port of New York, for example) were extremely lucrative for the officeholder. Placing the final decision for filling federal jobs on the President's shoulders ensured that the President never saw an end to aggressive office-seekers demanding an audience with him. The sheer volume of would-be appointees meant that a President generally lost weeks or months of work filling appointments.

The solution was a double-edged sword, and it remains so even today.

The former system conferred upon the Presidency the power to hire and fire appointees with no repercussions, which kept the government accountable for the actions of all of its appointees every 4 years and checked the size of the bureaucracy. The downside was that the President was expected to repay favors in this way, opening up opportunities for graft and wasting time that could have been spent on real work.

The system we use today is different in two key ways. One of these is more-or-less universally considered a worthwhile development, while the second is still debated to this day.


  1. The immediate solution to the problem of inept office-holders was to create a competitive system whereby candidates would be evaluated by objective standards set for each position. A Civil Service Examination determined the aptitude of a candidate for government work, so that at least grossly unqualified candidates would not be accepted. This was popular in the 1880s, and it remains so today. The American people do not object to submitting their prospective civil servants to standardized testing and evaluation procedures.
  2. The ultimate solution to the problem of Presidential power over the bureaucracy was to make the bureaucracy independent of the President. In the early days, the President had the power to almost totally clean house in the Executive Branch, and to fill many thousands of positions. In the subsequent period, the bureaucrats who entered a career of civil service had their jobs protected from Presidential capriciousness by regulations and by the requirement that cause be given for termination (a situation unlike most private-industry jobs in the US). Civil servants could expect to keep their careers for the long term, since the very nature of ensuring independence from political whims meant that they had an unmatched degree of job security. This has made bureaucratic positions almost impossible to eliminate once created, which means that the federal employment rosters will always grow and the government will always be accused of inefficiency. The inefficiency is, in most cases, simply required by law. It has nevertheless been vigorously opposed by those who rail against government waste, especially by individuals who have not contemplated what the alternative would be.


The President does still have the power to appoint thousands of people, but the bureaucracy has millions of employees in the 21st century, so the President's power is substantially weaker in this area than it once was. Even if the President had the singular goal of cutting down on the bureaucracy, he would not have the legal authority to do much about it. Congress has established the right of the people to decide how their civil servants are hired and retained, except for a very small number of appointees who serve "at the pleasure of the President".


Political Assassination in the American Political Consciousness


On a certain level, the experience of the assassination was substantially more therapeutic than Lincoln's had been. 

In the first assassination, the killer had supported the Confederacy out of arguably sincere principles, had fled successfully, eluded pursuit for days, and was ultimately shot and killed by a Union soldier rather than tried and convicted by a court of law. This did not let the government save face. It did not give the people satisfaction that their system was just, fair, and effective. On the other hand, there was grinding resentment toward those who had helped and sheltered Booth, with vastly disproportionate punishment doled out to them by vengeful military courts. In the end, people never knew exactly how much of a conspiracy it was. The death of Lincoln was an incompletely-resolved trauma for the American people- to some extent, it still is. The American political landscape would be unimaginably different today if he had survived.

Presidential assassination had only happened once, and so from 1865 to 1881 it remained a unique event.

 In the second assassination, the killer had been immediately apprehended, had been given as fair a trial as could possibly have been expected, and had revealed himself to be a horrible human being with a twisted sense of reality rather than any principled assassin. Guiteau's hanging was a foregone conclusion, but the fact that it had all happened as expected, with no loose ends, meant that the American people could finally come to grips with political assassination and not treat Lincoln's death as a unique horrifying tragedy. When seeking an explanation for why it had to happen this time, they could blandly note that Garfield was a martyr for civil service reform. That is, seemingly, the get-out-of-jail-free card for knowledge on Garfield- just mention "civil service reform" and you are doing as good a job as some textbooks.

Scholarly and dignified, Garfield had a clean image then, just as now. He had his enemies, but most of them were not enemies of principle. No scandals or improprieties are alleged of the Garfield administration. It was easy for the public to try and make sense of the death as if he were a martyr for the cause of civil service reform.

At the same time, the real effects of his death were somewhat harder to grasp. In the same way that LBJ would later feel bound by Kennedy's challenge to fulfill the Moon landing by 1969, Chester Arthur had assumed the office with a mandate to do what his predecessor wanted done. This occurred humbly, patiently, even graciously on Arthur's part, even though Arthur, a Stalwart, had had a reputation for supporting the spoils system and being a giver and recipient of political patronage in New York politics. The highlight of Arthur's presidency was the Pendleton Act, a groundbreaking civil service bill. During the Republican National Convention of 1884, the GOP failed to nominate him for a second term, so he retired from politics at the end of his first term. Arthur's transformation had been totally unexpected; he never gained the trust of reform-minded Republicans, but he also threw away his good graces with the Stalwarts. Arthur was popular with moderates of his time, but he would never have ascended to the Presidency if not for Garfield's death. The Stalwart faction of the Republican Party might have remained strong if Garfield had not been assassinated and its main reason for existence discredited.

Garfield, if he were alive today, might not be pleased with the current state of the federal government, but on a practical level, the tangible motivation of his assassin would never be a factor again, since civil service reform would prevent most job-seekers from benefiting from a personal audience with the President. Whether the reform actually ensured that appointees were the best people for the job is up for debate, but the fact is that newly elected Presidents afterward would not spend months re-appointing an entirely new Executive Branch from the ground up, and no President afterward would be attacked by a disgruntled office seeker.