Thursday, September 30, 2010
It's election season in the U.S., and the television is filled again with advertisements discussing foreign workers.
This is a large subject, but let me emphasize one thing: the ads have it backwards for software jobs. I want foreigners working with me. They are valuable members of the teams I've been on, and they create as many jobs as they take.
Whenever one of my coworkers has visa trouble, it's a real harm to the team. We lose lots of time, often days if not weeks, just due to the person filing papers, making phone calls, and traveling to offices. The national offices involved are far from friendly about the whole thing, either. They often keep bank hours, and they sometimes require presence in person. If you make any little mistake, the letters don't say, "You seem to have forgotten to file form IS-1042-T. Could you please resubmit it?" They are more like, "Get out, you rotten terrorist scum! If you aren't gone by tomorrow, your assets will be seized." None of this leaves the person in the best frame of mind to do good work.
Supposedly the point of all this is to protect American workers. The economics behind that doesn't apply to software, however. Most of the teams I've worked with have a backlog of 5-10 times the amount of work they're actually doing--work that would be valuable to do, if only they had a clone. When a foreign worker comes to the U.S. to work on computers, they don't knock someone else out of a job. They do one of those things that was previously being left on the table.
Moreover, having more people in the industry means that we all get smarter. They enrich the intellectual community. Smarter programmers are more productive, and more productive programmers make higher wages. Without foreign workers, we aren't as capable as we could be.
In short, I truly wish that nearly all barriers to foreign workers in my industry would be dropped. They're based on xenophobia and bigotry, and I'm embarrassed every time one of my coworkers must deal with them. If someone can get a computer job in the U.S., then let them come. They expand the pie by far more than they consume.
Sunday, September 12, 2010
It really was just about Flash
There are a number of platform wars going on right now, on various classes of computers. One of them is over the market for applications on consumer mobile devices. At the OS level, there are Android, iOS, Windows, RIM, and others. There are also cross-OS platforms, such as HTML and Flash. It's a good time to be on the buying side of a mobile device. Extraordinary levels of effort are being put into making each platform appeal to users.
Sometimes, though, the moves are not in consumers' interest. Apple's ban of alternate programming languages on the iPhone is just such a move. Jobs can say all he likes that Flash apps are inherently bad, but few truly agree. A more precise statement is that many feel Flash isn't the best possible tool in general. Programmers, however, matter more than their specific tools. I'm sure the best of the banned Flash apps are better than the worst apps currently being allowed. If the App Store simply focused on quality itself, rather than implementation technology, then iPhone users would get a better selection of apps to install.
Jobs knows this, and so he hasn't really been blocking all alternate languages from his platform. Just the Flash ones:
Other cross-platform compiler makers had had no such trouble, even during the monthslong stretch when the now-obsolete Apple policy had supposedly been in effect. Both Appcelerator and Unity Technologies, which sell iOS programming tools, stressed on Thursday that developers using their compilers had been able to get ported programs into the App Store since April.
Sick stuff. Happily, as word got out, the legal footing of the approach began to fray: Apple needed either to block Flash explicitly and specifically--thus facing antitrust issues--or to drop the bogusly general ban. They've now chosen to drop the general ban, which is really the best thing for users.
Sunday, September 5, 2010
The most important problem in computer science
Richard Lipton chooses The Projector Problem:
I believe that we are sorely in need of an Edison who can invent a projector system that actually works.
Hear, hear. What makes this so hard? I've had plenty of opportunity to muse on it at the beginning of talks, while people fiddle with projector and laptop settings. Here are the main obstacles that have come to mind:
- It takes an obscure manual intervention to turn on the projector output. Thinkpads have the best option, but that's not saying much: you hold down Fn and press F5. On other systems, you have to fish around in the UI for the screen-settings dialog. Given the importance of this problem, I think it deserves a large button right next to the VGA port.
- The laptop doesn't detect the projector's resolution. I'm sure it can, because even CRTs have advertised their capabilities over the cable, in an EDID block, for eons (see the sketch after this list). It's vanishingly rare that anyone will want a resolution other than the projector's max, but for some reason the screen-settings UIs don't just do that for you. In many cases they don't even gray out the settings that aren't going to work on that projector.
- The settings UIs are universally terrible for switching to projection mode. On an Apple, you are given a list of 10-20 resolutions, almost all of which are bad ideas. On Windows, you have to click to a separate tab to even get to the place where you can turn on projector output and modify resolution. The NVidia UI on Linux, meanwhile, takes the horror to a new level. It would take a small novel to describe it all, so let me just mention that it involves knowing what "Twinview" is. Thanks, NVidia. You took what should be a trivial problem and instead of just making it work, you are making it a teaching moment for customers to learn your brand names.
- If you use an Apple product, you additionally have the problem of finding the right dongle before you can plug in. These things are like pencils and pens: no matter how much you replenish the supply, they keep disappearing. Once one conference room loses one of its dongles, people start borrowing them between rooms, so they all share in the pain. I tried carrying my own, but that doesn't work, because eventually I loan it out and it disappears. There must be some alternate universe that is collecting all the Apple dongles from this one. They really just disappear.
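For what it's worth, the plumbing for that resolution detection already exists. Here is a minimal sketch of pulling a display's preferred mode out of its EDID, assuming a Linux machine where the kernel exposes the raw block in sysfs; the connector name is hypothetical and varies by machine.

```python
# Minimal sketch: read a display's preferred resolution from its EDID.
# The connector name ("card0-VGA-1") is hypothetical; list /sys/class/drm/
# on your own machine to find the right one.

EDID_PATH = "/sys/class/drm/card0-VGA-1/edid"

def preferred_resolution(edid):
    # Bytes 54-71 of a base EDID block hold the first detailed timing
    # descriptor, which by convention describes the preferred mode.
    d = bytearray(edid[54:72])
    h_active = d[2] | ((d[4] & 0xF0) << 4)  # active pixels per line
    v_active = d[5] | ((d[7] & 0xF0) << 4)  # active lines
    return h_active, v_active

with open(EDID_PATH, "rb") as f:
    print("Preferred mode: %dx%d" % preferred_resolution(f.read()))
```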
This doesn't seem like a terribly challenging problem, really. It just takes a laptop maker deciding it's a problem worth solving. A laptop maker could, if it chose, put a big "Project" button next to the VGA port. When pressed, it would switch to mirrored display mode at the max resolution the projector supports, and it would pop up a dialog asking if everything looks OK. If the user clicks Yes, that's it -- done. If the user clicks No, it would switch back to the previous resolution and drop the user into a settings dialog.
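In sketch form, that whole flow is a dozen lines. Every function below is made up--a stand-in for whatever mode-setting API the platform actually provides--but the logic is the entire feature:

```python
# Hypothetical sketch of the "Project" button; each helper stands in
# for the platform's real display API.

def on_project_button_pressed():
    previous = get_current_mode()            # remember where we started
    w, h = projector_max_resolution()        # e.g., from the EDID, as above
    set_mirrored_mode(w, h)                  # mirror at the projector's max
    if ask_user("Does everything look OK?"):
        return                               # the common case: done
    set_mode(previous)                       # otherwise, put things back...
    open_display_settings_dialog()           # ...and let the user fiddle
```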
Would any laptop maker care to do this, or are you all going to keep working on those gimmicky CD-player buttons?
Monday, August 30, 2010
Patents as Mutual Assured Destruction
The best way I can understand the popularity of software patents is that they protect incumbent companies from newcomers. Large incumbent companies accumulate patents, use them to litigate against small newcomers who have no patents of their own, and form patent-sharing agreements with each other to prevent the same thing from happening to them. Occasionally it comes back to bite one of the incumbents, but for the most part they seem to believe it comes out in their favor.
One place this arrangement fails, though, is if one of the incumbents decides not to play for the long term. See, the reason incumbents don't sue each other over patents is that they fear the counter-suit. It's classic M.A.D.: mutual assured destruction.
However, what if an incumbent is on their way out of the computer business, either because they are shifting focus or because they are retiring? Well, in that case, the fear of a counter-suit would be nonexistent, wouldn't it? Count me in as one who thinks Paul Allen's recent actions suggest he is planning to retire, or at the very least get out of computers. My next best guesses are that he is trying to make some sort of point, or that he is simply unsavvy about the software industry. Neither of these sounds especially likely.
Wednesday, August 11, 2010
Rubik's Cube's difficulty cracked
I'm late to notice this, but recently a team proved that a Rubik's Cube can always be solved in 20 moves or less, regardless of its initial configuration. It had already been proven, back in 1995, that at least one initial configuration requires 20 moves to solve. Thus, all positions can be solved in 20 moves, and some positions require the full 20.
What is particularly interesting is that these researchers found the 20-move bound by having a computer solve all of the roughly 4E19 initial positions exhaustively. The best proof that didn't use exhaustive search only established an upper bound of 22 moves.
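As a sanity check on that 4E19 figure (my arithmetic, not the researchers'), the standard count of reachable cube positions goes: permute and orient the eight corners and twelve edges, then halve the total because the corner and edge permutation parities must agree.

```python
from math import factorial

corners = factorial(8) * 3**7     # corner permutations x orientations
edges = factorial(12) * 2**11     # edge permutations x orientations
positions = corners * edges // 2  # parity constraint removes half

print(positions)  # 43252003274489856000, i.e. about 4.3E19
```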
I wonder if a human-digestible proof will be found for the 20-move upper bound, or if we'll be left with computers generating the stronger proof. At any rate, this is yet one more problem where a mathematical result depends crucially on some very heavy computation.
Thursday, July 8, 2010
Pseudonymity
People participating in online forums are better off being identified by pseudonyms rather than by their legal names. This is pretty ingrained in me after many years of participating in such forums, so it takes some soul searching to explain. Let me try to distill out three points.
First, people have multiple parts of their lives, and they don't want them to mix. There are many reasons why this is good, but at the very least let's observe that this is how most people arrange their lives. There's work, and there's play. On the Internet, pseudonyms let those separate lives stay separate more effectively.
Second, it fights prejudice. What makes prejudice so bad is not just that people are judged wrongly, but that they are judged wrongly using information that really should be irrelevant. Using pseudonyms means that this irrelevant information can be left out entirely. If your name is Julie or Juan or Duk-Kwan, you can expect to get a different--unfairly different--reaction if people learn your name, and thus your probable gender or ethnicity.
Finally, let me emphasize that pseudonyms are not anonymous. They are actual names, and they accumulate a reputation just like any other name. "Tom Cruise" is a pseudonym, but it's a name that has a very strong reputation (of one sort or another). So it goes with online pseudonyms, as well.
Given this, readers won't be surprised that I oppose Blizzard's trend toward using a "real" ID, "real" meaning the name on the credit card that pays for an account. Already, if you want to participate in cross-server chat in their games, you have to expose your credit-card name to everyone on your cross-server friends list. Now they are talking about changing the official forums to use credit-card names rather than character names.
The idea seems to be that if people post under their credit-card names rather than their Warcraft character names, then they'll post better content to the forums. I don't agree that this is a sufficient reason for the change, and I don't even think they'll get the result they hope for.
Aside from all this heavy stuff, why in the world is a fantasy online computer game going this way? Grey Shade says it best:
But that’s it, you get it? That’s why I play. That’s why my friends play. Because we like to come home from a long day of being John Smith or Jane Doe and get on the computer and MURDER SOME REALLY AWESOME INTERNET DRAGONS.
UPDATE: Blizzard cancelled enforced real names on the forums, and said they are going to strive to keep real names from leaking in-game for players who want them private. Good choices! Crisis averted. Everyone can go back, now, to killing Internet dragons.
Thursday, July 1, 2010
A foreign film I'd love to see
Java 4 ever is a hilarious trailer for a made-up movie. It has all the cliches from a Hollywood warm-human-tale kinda movie, but instead of being about forbidden cross-sect love, it's about open computer standards.
Beware that the trailer is R-rated at one point.
HT Ted Neward