As I read over these things I can't help but wonder: have they not completely missed the boat here? Apple's decision to forgo a physical keyboard in favor of a virtual one was without a doubt a huge, nay epic, decision. While the folks over at RIM were scratching their heads over how they could possibly make QWERTY keys smaller, even resorting to putting two letters on a single key, Apple saw the issue and sidestepped it like the plague. Of course, this decision was very much a gamble: are people ready for this? Are they willing to give up a physical keyboard? The answer is yes. While I can hear the hardcore Blackberry users yelling, I believe this is a case where they need to be ignored (end users don't like to hear this, but there are occasions when it's true).
Want proof? Take a gander at the Blackberry Storm and the Android G1.
Like the iPhone, the Storm skipped the physical keyboard and in its place put a virtual one with "haptic feedback" (the much-discussed 'clicking'). I had the chance to play with a Storm briefly this past week and found it to be a nice phone, and while it did take me some time to get used to the keyboard, I could see myself becoming quite good with it after a few days of use. Oddly enough, one of the common questions/complaints with the Storm is whether or not the 'click' can be turned off so the keyboard can be used as a plain touchscreen (the answer is no). While we're on this tour d'phone, it makes sense to discuss the Android G1, which did choose to include a physical keyboard in its design. What feature are the Android developers feverishly working on? That's right, a virtual keyboard. While I have not heard many complaints about the G1's keyboard (granted, I don't know anyone with one), people still want a virtual keyboard, and the convenience of being able to immediately type away without having to flip out a screen.
The transition to a virtual keyboard can no doubt be an unpleasant one, but realistically a keyboard the size of a business card (virtual or not) is going to take some getting used to. Before you start pondering how you are going to text while driving on a virtual keyboard, you won't have that problem for long: it becomes illegal in California in two days. Problem solved.
After having used the iPhone keyboard for a while now, I can say that the only real issue with it is the lack of landscape typing in crucial applications like Mail and SMS. While I do fine with the vertical keyboard, I think the addition of sideways typing could really make it easier to type away on a long email. I am in no way the only person who feels this way, and I know Apple has received many-a-request for this.
Getting back to the original point of the article, the decision to omit a physical keyboard was no doubt a large one, but it is not something I see being reversed. Virtual keyboards are going to continue to improve to the point that we will look back and laugh at the Blackberries with rice-grain sized keys. So those of you holding out for an iPhone with a full QWERTY, don't hold your breath, or better yet keep dreaming.
Just a few quick notes on the other features people want added to the phone:
60GB of space - Sure, who doesn't want more space? That's just a matter of price; SSDs aren't getting cheaper as fast as anyone would like.
Front-facing camera - Word is the G2 is going to have this; we'll see. The kind of video you could see (and transmit) in real time over a 3G network seems pretty awful. While there are physical limitations that constrain this, I think the network is the other big consideration.
Optical-zoom camera - No doubt the iPhone camera is beyond pathetic. The lack of even primitive video-capture abilities is quite perplexing considering that every cheapish cell phone today can do this. I think full optical zoom is a bit much, though. Realistically I see improvements to the camera in the future, but it's still going to be second-class in the camera department, as are all cellphone cameras.
The first thing to understand about BitTorrent and TCP (which it currently uses/defaults to) is that the two are a poor fit. If you think about the way BitTorrent works, it generally creates lots of connections for short-lived, high-throughput transfers. Unlike previous P2P applications (e.g. Kazaa/Napster), files on BitTorrent are split into small chunks; this way the chunks can be downloaded from different hosts simultaneously and likely significantly faster. Chunks themselves are generally quite small (1MB by default), and most people tend to have high-speed (broadband) connections, so the duration of a connection to any individual host is short. These factors make TCP a terrible fit for BitTorrent. Basically, when TCP starts a connection it begins sending very slowly (1 in-flight packet) and builds up (exponentially) until a packet is lost, which signals the sender that it may be going too fast. While the exponential increase should lead the sender to reach the maximum send rate quickly, it is not quick enough. With the high-speed broadband connections available today, it actually takes a fair number of round trips for TCP to reach its maximum send rate. Based on some back-of-the-napkin calculations, a 1 MB/s (~8 Mbps) link requires 93 in-flight packets (assuming a 1500-byte packet size, a 100ms RTT, and Throughput = 0.75*(window size/RTT)). Starting at 1 packet, it will take slow start 7 round trips (2^7 = 128) to get up to that rate, and 1 MB/s isn't even that fast.... Since the connections made are generally short-lived, they will likely spend most of their time in slow start, sending much slower than they actually could.
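The back-of-the-napkin numbers above are easy to check in a few lines of Python (the 0.75 factor, 1500-byte packets, and 100 ms RTT are the same assumptions used in the text):

```python
# Throughput = 0.75 * (window size / RTT), solved for the window,
# then count how many doublings slow start needs to reach it.
PACKET_BYTES = 1500
RTT = 0.100                 # seconds
TARGET = 2 ** 20            # 1 MB/s link, in bytes per second

window_bytes = TARGET * RTT / 0.75
window_packets = window_bytes / PACKET_BYTES
print(round(window_packets))    # -> 93 in-flight packets

# Slow start: 1 packet in flight, doubling every round trip.
rtts, in_flight = 0, 1
while in_flight < window_packets:
    in_flight *= 2
    rtts += 1
print(rtts)                     # -> 7 round trips (2^7 = 128)
```

And that is for a connection with zero loss; a single early drop knocks TCP out of slow start and stretches this out even further.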
With all of these factors it comes as no surprise that the folks over at BitTorrent want to make their own protocol which they can optimize for torrenting, especially since torrent traffic is completely different from the types of traffic TCP was designed for and performs well with. The protocol and the client itself are proprietary, which is going to be a point of contention with quite a few people, I am sure.
The reason some have predicted the meltdown of the internet is that UDP does not employ any congestion-control or congestion-avoidance schemes like TCP does; it instead continues to blast away packets as fast as it can. That is the basic version of UDP; the nice thing about UDP is that one can add to it the services one requires and omit those one doesn't. TCP has so many things built in (reliability, etc.) that hurt performance for this workload that they simply chose to build on UDP, including only the things they need without having to work around the things TCP has already included.
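To make the "add only what you need" point concrete, here is a minimal sketch (the 4-byte sequence-number framing is made up for illustration; BitTorrent's real UDP protocol differs): plain UDP delivers datagrams with no reliability at all, so the application itself tacks on a sequence number and an explicit ack, and nothing more.

```python
import socket
import struct

# Two UDP sockets on loopback standing in for peer and downloader.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))          # let the OS pick a free port
sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

seq, chunk = 7, b"piece-of-a-chunk"
# Datagram = 4-byte sequence number + payload; UDP adds nothing else.
sender.sendto(struct.pack("!I", seq) + chunk, receiver.getsockname())

data, addr = receiver.recvfrom(2048)
got_seq, payload = struct.unpack("!I", data[:4])[0], data[4:]
receiver.sendto(struct.pack("!I", got_seq), addr)   # ack echoes the seq

ack = struct.unpack("!I", sender.recv(4))[0]
print(ack == seq, payload)
sender.close()
receiver.close()
```

Retransmission timers, ordering, and (crucially) a latency-sensitive congestion-control scheme would be layered on in exactly the same way, one piece at a time, without dragging in the rest of TCP.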
For now, the UDP-based protocol will likely avoid many of the throttling mechanisms put into place by ISPs (e.g. Comcast) to help curb downloading. I don't believe this is an attempt to evade those mechanisms, though, since moving to UDP actually makes torrent traffic much easier to distinguish from other traffic, whereas with TCP it is a somewhat more convoluted process. I suspect this will make it easier for ISPs to classify torrent traffic; the question is what they will do with it.
I just read this article over at networkperformancedaily.com where the author actually talks to the VP over at BitTorrent Inc. about why they chose to make a new protocol. Based on what he said, it does in fact seem like they are doing the responsible thing, even stating that the protocol needs to be "MORE sensitive to latency than TCP." He also says that it is a "scavenger protocol. It scavenges and uses bandwidth that is not being used by other applications at all." I think this is absolutely the correct approach to take. Just as a kicker, the first comment posted on that article was by Richard Bennett, the author of the Register's "The Internet is going to melt down" article.
The more I think about it, the more clever the idea is. Think about this: although Sears is a tried and true name in their business, their reputation with younger audiences is less...inviting. Yes, if my dad is looking to buy a new washer and dryer he'd probably look at Sears, but what do I need there? Sears is about to tell us (the younger audience).
This advertising methodology relies on two main components. The first part is very similar to rebates: sure, it all sounds good on paper, but how many people will actually go through with it? Sears has eased the process by shipping you the card, but in order to give them your information you need to register on a website within 24 hours of becoming a fan. I'm quite sure that many of the people who become fans will miss this little tidbit and end up giftcard-less.
Now that Sears has you as a registered "fan" of their store, they have a direct communication channel with you. They are now primed to "update their fans" on all of the new and exciting offerings from Sears; think about how many eyes will see these updates relative to advertising in the newspaper, and at what cost. Furthermore, updates on their site will be displayed in each fan's news feed (you know, that one). That's a nice piece of real estate that Sears will get to inhabit when posting updates: it is the first page the 125 million Facebook users see when they log in to their account, which apparently over 50% do daily (oft quoted but apparently never cited, sorry).
All in all, although the kneejerk reaction isn't a good one, I think Sears may be onto something. It is definitely an appropriate campaign for them, and I suspect you will start to see quite a few like it pop up soon, especially with the holiday season coming up.
import os
import random

file = open(filename, 'r')
#Get the total file size
file_size = os.stat(filename).st_size
#Seek to a place in the file which is a random distance away
#Mod by file size so that it wraps around to the beginning
file.seek((file.tell() + random.randint(0, file_size)) % file_size)
#don't use the first readline since it may fall in the middle of a line
file.readline()
#this will return the next (complete) line from the file
line = file.readline()
#here is your random line in the file
Using this method has proven to be MUCH faster than doing something along the lines of:
for line in file:
and furthermore, iterating that way does not make it easy to pick a random entry from the file.
Update: Just a note on the "file.seek" line above: the "file.tell()" call is not strictly necessary, since the line you are currently on has no bearing on the next line you will jump to. Removing it would also remove the need for the mod operator, since the randint would never be larger than the file. Furthermore, since you are advancing the file pointer to a random spot in the file, if all lines are not of equal length you will not get true randomness; longer lines will in fact be more likely to be hit. Since the algorithm takes the line after the random spot seeked to, this issue is lessened but not resolved. I'm looking for a better and more efficient solution in the meantime.
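For what it's worth, one standard way to get a truly uniform random line (at the cost of reading the whole file once) is reservoir sampling; a sketch:

```python
import random

def random_line(filename):
    """Return a uniformly random line using a single pass (reservoir sampling)."""
    chosen = None
    with open(filename, 'r') as f:
        for n, line in enumerate(f, start=1):
            # Keep the new line with probability 1/n; after the loop,
            # every one of the N lines has had an equal 1/N chance of surviving.
            if random.randrange(n) == 0:
                chosen = line
    return chosen
```

It is O(n) in the file size rather than O(1) like the seek trick, but it is unbiased regardless of line lengths and never lands mid-line.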
With that in mind...
One thing that is infinitely entertaining to study is the techniques we all employ to deal with the different parts of our daily lives. Things like email, which we all use, and all have our own ways of managing. For example, when I check email I do my best to ensure that when I am done my inbox has 0 unread messages. Of course I don't read every email; I do the "mark as read" bit quite often, and it seems as if many other people do the same given how prominently placed it is on just about every email site. I use labels/folders and filters to separate emails into different groups, nothing wild there. I often catch glimpses of other people's mailboxes and it is interesting to see how they manage theirs. I'm not going to go into details on all the interesting and bizarre things I've seen, but it seems to work for them (at least for the time being).
So now we arrive at RSS feeds, which, similar to email, require some time to figure out how to manage. I subscribe to 22 RSS feeds, and the post frequency between them varies quite widely. On one end I subscribe to the feeds of a few authors whose books I've enjoyed, who generally post a few times a week if that; on the other end of the spectrum there are things like "deal" websites which have a few hundred posts a day. In the middle I have the gamut of tech sites (a la Slashdot, Gizmodo, etc.) which tend to have roughly 10-20 posts per day.
Obviously the sites I subscribe to contain a wide range of things I want to keep track of, but at the same time RSS brings all of those things together, for better or worse. Before, if I wanted my news I would browse through a few sites, with how many depending on how much time or interest I had at that particular moment. In general I tend to look at things in groups, so I may start with regular news, then tech news, then a few blogs, etc. I would get all the information from one type of site I was interested in, then move on to another. With RSS feeds everything is in one location, so I can quickly see posts from all of the sites, and it tells me when there are new articles to read. Unfortunately that means I frequently find myself switching from Techbargains to xkcd a bit too quickly and I don't properly "context switch" between them. As a result I often lacklusterly browse the posts in the feed when I really wasn't in the mood to read that particular information at that time; had I read it later I would likely have done a much better and more thorough job. Given that most people, including myself, employ the "only show me unread messages/posts" feature (if I already saw it, why see it again?), we will likely never see that post again.
All this leads me to the fact that I simply don't know how to read RSS feeds. I am working on my own technique, but it simply takes time to develop. I thought that after already having one for email the rest would be easy, but that's not quite the case.
I think RSS is one of those technologies that is going to be here for quite a while; it's really a viable way to transfer information (once we figure out how to use it). For a testament to how accepted it is becoming, open up your Facebook. What is the first thing you see? Yep, your "News Feed". Surprised?
I am also glad to see that Google is taking a hard stance on the cloud downtime complaints and fears. This is one of those fears that is continuously brought up when you talk about cloud services. It will be interesting to see what sort of response this gets; I suspect plenty of people out there are going to be investigating.
Seeing as how questions come in packs, the immediate follow-up to this is, "isn't there already Windows Server 2008? What's wrong with that?" Yes, both operating systems are designed to run on servers, but by making an OS specific to the cloud there are quite a few optimizations which can be made to get even more performance out of the machines. In the case of Azure, it is definitely going to run as a virtual machine inside the data center, much in the same way that Linux/Windows images are virtualized on EC2. Virtualization is really the key to making the cloud possible. It allows multiple OS instances to run on the same physical machine while encapsulating them from each other; that way one VM can't mess with another's data or processes. Also, when someone is done with their VM, it is simply replaced with a clean VM image and it's as if they were never there. On EC2 they run up to four VMs per machine; I highly suspect that by trimming down their server OS Microsoft will be able to get a lot more VMs running reasonably on a single physical machine (I won't speculate as to the exact number, but I think it will be at least double digits, maybe more). Keep in mind that, as opposed to EC2 where users have direct access to the OS, on the Azure platform users won't have direct access to the operating system. Therefore Microsoft can toy with the number of VMs to figure out the optimal number to run on their servers.
So in summary, the new OS is not absolutely necessary; they probably could have done it on Server 2008. But given the scale they are shooting for, and the amount of optimization they can achieve by making a purpose-built OS, it is no real surprise they made a new version specifically for this function.
Today, at PDC, Microsoft announced Azure, its cloud computing platform. Just so it is clear, Azure is also the name of the operating system that the Azure platform will run on (previously dubbed Windows Cloud by Ballmer). I will address the "Why does the cloud need another OS" question in the next post; for now let's just stick to the platform. For those of you unfamiliar with the cloud, it effectively breaks down into three different levels of service which can be provided (see this Tim O'Reilly article for a full definition of the three layers). Azure falls into the platform layer, which is the middle of the three-layer hierarchy. As the title suggests, this layer provides a "platform" which developers can utilize to write and run their applications. For example, let's say that I want to create a web application which allows users to input their contacts and access them from anywhere. If I were to do this using a framework like Rails (with Ruby) or Django (with Python), then after designing the application itself I would also have to set up a database which would be used by the application to store data like user login information and the actual contacts themselves. Using the platform layer, instead of running a database myself, they (the service provider) run the database for me, and I just access it through the provided interface. I still determine what tables I want and how I want them laid out; it's just that instead of actually writing the SELECT statement, I call a SELECT method. It is up to the provider to figure out how they should host the database and make it scale. Similarly, I as the user don't have to set up the platform and the underlying operating system; this is handled by the provider. Note how this is different from the infrastructure services (a la Amazon EC2).
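To illustrate the "SELECT method instead of a SELECT statement" idea, here is a toy sketch. The class below is a hypothetical stand-in for the provider's data API (Azure's real storage interface will look different), with an in-memory SQLite database playing the provider-hosted store:

```python
import sqlite3

class PlatformDatastore:
    """Hypothetical platform-layer data API (illustration only)."""

    def __init__(self):
        # The provider, not the developer, owns and hosts the database.
        self._db = sqlite3.connect(":memory:")
        self._db.execute("CREATE TABLE contacts (user_id INTEGER, name TEXT)")

    def insert(self, table, row):
        cols = ", ".join(row)
        marks = ", ".join("?" for _ in row)
        self._db.execute(
            f"INSERT INTO {table} ({cols}) VALUES ({marks})",
            tuple(row.values()),
        )

    def select(self, table, **where):
        # The provider translates the method call into SQL (and worries
        # about hosting and scaling); the developer only sees the rows.
        clause = " AND ".join(f"{key} = ?" for key in where)
        cur = self._db.execute(
            f"SELECT * FROM {table} WHERE {clause}",
            tuple(where.values()),
        )
        return cur.fetchall()

store = PlatformDatastore()
store.insert("contacts", {"user_id": 42, "name": "Alice"})
print(store.select("contacts", user_id=42))   # -> [(42, 'Alice')]
```

The developer never writes SQL or runs a database server; how `select` is actually executed and scaled behind the method call is the provider's problem.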
Basically, I would write the contact application, making sure that the database interaction goes through their API; once I'm done (with this release) I hand it over to the provider. It is up to them to figure out how to deploy it and scale it up and down as necessary. Again, very different from EC2, which requires the user to determine how to deploy it and manually scale the number of machines up and down (or use a third-party service like RightScale).
Ok, now that you have a better understanding of what a platform and a platform service are, it should be clear that Azure's main competitor is Google App Engine. Many people will lump Azure and Google App Engine together with EC2, but in fact they provide very different levels of service. Think about it in terms of a car. Amazon EC2 is really the infrastructure: it's as if someone gave you the frame of a car, and it is up to you as the car-builder to find the appropriate engine, tires, and seats depending on what you intend to do with it (drive on the freeway vs. enter it in a monster truck rally). With platform services it's as if you were given the frame of the car with the engine and tires already installed (and you can't remove them). The key parts of the car are there; it's up to you to put in things like the seats and the gas. Notice, however, that you don't have a choice in what engine and tires were provided, so they may fit your needs, but they may not. You have less control over the infrastructure in the sense that it is provided for you and you can't change it, but you also save yourself the complexity of having to figure out what engine would fit in your car, how many cylinders, etc.
1. Incompatibility was and still is an issue, albeit the tip of the iceberg
2. UAC is annoying
3. Vista is bloated
4. Vista tried too hard to be pretty
I suspect that if you have read other articles about Vista you have probably seen some combination of these complaints. This is by no means groundbreaking stuff; in fact, I'd bet I could find a few articles from 6 months or even a year ago that present just about the same argument. Even still, that doesn't prevent you from getting it wrong.
Incompatibility was definitely a big issue when Vista came out; there is no getting around that. Basically, the second hardware vendors saw that Vista was getting a lukewarm reception, they put Vista drivers a few notches lower on the list (aka they weren't going to do it, or at least not quickly). Then, when some user goes to install their Brother printer and it doesn't work, they get mad. Is that Vista's fault, or the hardware vendor's fault? No reason to point fingers; it just stinks for the end user. In either case, post-SP1 incompatibility is unlikely to be the issue. Sufficient time has elapsed for even the slowest of vendors to get Vista drivers out. If they have not done so already, I definitely put the blame on the vendors; they have had long enough.
Good ol' UAC seems to be pissing a lot of people off. Does this not strike you as odd? OS X and Linux have had the equivalent of UAC since their inception, but no one seems to complain about it there. Yes, UAC likely pops up a lot more often than what you are used to in OS X and Linux, but in all reality, if it is popping up, it likely should be. People have become so accustomed to pre-Vista Windows, where nothing pops up to notify you that some terrible website is trying to put a Trojan on your system. Realistically that is just a poor model for security and is likely to produce far worse results (in terms of keeping a clean system) than the UAC model. It is a lot easier to prevent malware from installing than it is to remove it once it already has.
I think when these writers think of Vista they get the image of the Michelin Man running in their head. How exactly people determine that Vista is "bloated" is a very subjective process and in all likelihood does not imply any actual knowledge of the underlying operating system. Much of this perception comes from people seeing the amount of RAM Vista uses by default compared to other OSes. It's hard not to make that association, but it simply is not valid in this case, not by default at least. Vista has a feature called SuperFetch which proactively loads frequently and recently used programs into RAM before they are even started by the user. Why? It's simple: if it guesses what you are going to start and puts it in RAM before you start it, startup time is drastically faster than it would be otherwise. So when someone looks at Vista's RAM usage and sees that it is using 2GB at startup, they wonder, "How is it using so much RAM? I haven't even started anything!" Chalk a lot of that usage up to SuperFetch, and hope that SuperFetch uses as much RAM as possible. Using RAM is GOOD, that's what it's there for, and as long as SuperFetch gives it up when necessary all is well in Windows world.
Even still, the Vista bloat argument continues. Consider this: when Macs changed from PPC to Intel hardware it was a HUGE change. It required a new OS, programs to be rewritten, and just about anything Apple-related that was PPC became outdated. They effectively drew a line in the sand and said it stops here; anything older than this, we are done with. What happened? People bought new Macs. Those left with PPCs were left to wither away out in the cold. This wasn't too terrible for those users, since Mac users (and Apple users in general) tend to be a lot more dedicated to the brand than most people are to just about anything. Let's say Microsoft employed this strategy with their next version of Windows (clearly they aren't with Windows 7, but let's just say for the sake of argument). I don't think any of us could imagine the amount of uproar such a decision would cause. Even though Microsoft wouldn't be changing the hardware (they don't make their own hardware; Apple does), the software changes alone would bring about so much complaining we wouldn't be able to hear ourselves think. The sheer amount of labor required to rewrite software for the new Windows would be enormous and an absolute nightmare. You think people are pissed about Vista? The amount of headache this would cause would make Vista seem like a blip on the radar. Given the number of people who use Windows, and the amount of software written for Windows, it is going to be much more difficult for Microsoft to draw that line in the sand.
Finally, there is always the claim that Vista is trying to be too much like OS X and too pretty. It's no secret that Microsoft significantly dressed up Vista compared to its predecessors, and it's hard to argue that Vista doesn't look better. Whether or not it is worth the resources it draws is a matter of preference; to me, it is. So Vista is trying to be too much like OS X? Really? Are we talking about specifics in the UI? Because I don't see many. You're not going to see Windows with a "dock" any time soon (especially now that it's patented), I can't see Windows using left-side close and minimize buttons, or adopting the silver theme by default, so where is the copying? If you are simply making the case that they are trying to make Vista pretty, then you can't fault them for that; part of being a successful operating system is being pretty. Don't believe me? Ask Linux. One of the big pushes in Linux over the past few years has been beautification, and it has paid off. Compare your distro of choice now to how it looked 2-3 years ago and you will see that it is likely a lot prettier, shinier, and slicker overall. Linux demonstrates the case beautifully: pretty is a very important part of being a good OS, like it or not.
I suspect with Windows 7 hype continuing to build, more and more of these articles will be resurrected from the murky puddles they belonged in. Please, if you are going to make an argument about Vista being “sucky” think about it before you just regurgitate the arguments of others.
The only real advertising I could see happening in games is product placement, like in TV shows where someone like Pepsi pays to get a member of the cast to take a sip of their soda on camera. Can't you just see Tiger taking a nice big swig of Pepsi after blasting a 350-yard drive down the 18th at St. Andrews?
In the past few years the amount of electronic equipment that the average traveler carries has gone up significantly (the % increase from 0->1 is infinity). So when people are traveling in the airport between flights, aside from eating, what are they looking to do? That's right: charge their gadgets.
Considering how little traveling I do compared to the very sharply dressed gentleman next to me who is also clamoring for an outlet, it is clear that I am not the only one who has these thoughts. Considering the burlesque show we have to put on going through airport security, can we at least get some power outlets? I mean, if that's not the oversight of the century, I don't know what is.
In case you haven't heard, the excellent website Hulu will be premiering many primetime TV shows either on or before their on-air premiere dates. In case you weren't aware, Hulu is a very good online video site with a good selection of TV shows and movies which can be streamed with minimal advertising. For a 30-minute TV show there are generally 3 commercials, each 15-30 seconds long. I was reading this article on Wired which brought up an interesting point: with Hulu airing the premiere of Prison Break on the same day it premieres on TV, why have over a million people downloaded it (illegally) via torrent? This situation is not unique. On October 10th of 2007 Radiohead released their album "In Rainbows" only on their website. Fans were allowed to pay as much as they wanted (including $0, free) for the album, which was available for download. Well, no surprise when shortly thereafter the album appeared on torrent sites. Again, if it is available for free, why would an estimated 2.3 million people download it (illegally) via torrent?
Note: while I would never engage in either of these actions myself, I may know a few people who do, and so it is their coalesced opinions, with my own spin, that I include here.
The answer is made up of a few different sets of people.
First, there are the people who didn't know about the availability, and therefore went to their usual method of acquiring the content: torrent. The same way they have for all the other episodes of Prison Break or Radiohead albums, they fire up the tracker of choice and download away. These people can't be blamed; although publicized, the number of people who knew the content was legitimately available was still a very small subset of all those who would otherwise be downloading. The offers might be on the front of the tech section of every online news/blog site, but you have to read it to know.
Next, there are people who heard about it, but were too late. They had already downloaded the show/album before they heard of the promotion. At that point they could go back and watch it on Hulu or download it from the Radiohead site, but much of the allure is lost, although both sites have their revenue opportunities intact which I’m sure they are pleased about.
Finally, there are those people who did not agree with the terms under which they would get the media: ads in the case of Hulu, and a required email address and an ask for donations on the part of Radiohead. To be honest, I really don't find either too bad at all. Ads on Hulu are significantly shorter than those on primetime TV, you generally only see one per break, and you have all the abilities of a DVR (pause, rewind, etc.) without needing one. Radiohead simply asked for an email address (no spam to date) and for donations (which they made clear could be $0, no CC# required). What's not to like? Some people just won't give in...what can I say?
As for these two occurrences, I think they are incredible. They demonstrate an embrace of technology by the two industries which have classically fought it off like Ugg boots. I just have two comments about these:
1. Hulu, why on earth do you not have an iPhone app out yet? Are you waiting for me to make it?
2. Radiohead, why not make the album available via torrent? You save on hosting costs and make it more widely distributed. Seeing as 2/3 of the downloads came from torrent, why not focus on those people and hit them up for donations instead?
I'm not really sure what people were expecting this ad to be. Based on my very empirical study, this ad was expected to be the hippest, coolest, most ultimate ad ever.
It was mildly entertaining, but only because it relies on two of the most famous faces in the world (t-minus 10 years ago, of course). The premise is a play on BillG's well-known...thriftiness...and has some shout-outs to those nerds who have been paying attention. In terms of Seinfeld, it's roughly the Visa ads, except without Superman.
I think the real surprise of it all is the lack of heavy branding or slogans. Across the entire minute and thirty seconds of the commercial, the word Microsoft was mentioned once, in passing, by Seinfeld. Then there is the slogan "The future. Delicious." followed by the Windows logo. No Microsoft logo, no mentions of the word Windows, or dare I say…
I’m sure pundits are savagely pounding at their keyboards about this one, “$300 million for this!” or “that’s going to beat Apple?!” Stop it. Please. Just, stop it. First, I’m very sure that this is the first of many (random guess says 5) commercials to come out of this campaign. Second, like I have said you will not see direct shots at Apple. Why? By “competing” with Apple, Microsoft would be admitting that they are on their level. Believe what you will, Microsoft won’t be admitting that any time soon.
Clearly the Kindle is a very intriguing idea, especially in the textbook market. For a long time people have said that eBook readers were just flat out sub par. Hard to read, poor battery life, small memory were the most common complaints, and the Kindle seems to have solved some of these (battery life is mediocre).
In my reincarnation of the eBook reader I took a different route than the Kindle. I really didn’t see the need for a keyboard; all you really need is a power button and two buttons to flip the pages. In the case of textbooks I thought it would be very cool to have some highlight functionality, so add a stylus and you could highlight portions of the text.
When I look at the Kindle there are a few things I can only hope for which would make it a truly useful tool. First, some form of highlight/tagging feature which lets you select parts of the text. Once selected, you could iterate through all of the highlighted items for a quick review of the most important parts. Next, if Google’s $700 stock price has taught us anything, search is key. There is a keyboard on the thing; please make use of it and allow people to search through textbooks. Sometimes you need a quick reminder of the truth table for a clocked d-latch, and if I can’t search in the textbook, everyone will just resort to searching for it online, or spend a while finding it in the book.
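To make the wishlist concrete, here is a minimal sketch of the two features described above — highlighting spans for later review, plus simple full-text search. This is my own toy model, not Amazon's API; the `Book` class and its methods are hypothetical names for illustration.

```python
# Toy sketch of the highlight-and-search features (hypothetical, not the
# Kindle's real API): store highlighted passages per book, iterate through
# them for quick review, and do a naive full-text search.

class Book:
    def __init__(self, title, text):
        self.title = title
        self.text = text
        self.highlights = []  # list of (start, end) character offsets

    def highlight(self, start, end):
        """Mark a span of text, as a stylus selection would."""
        self.highlights.append((start, end))

    def review(self):
        """Iterate through highlighted passages for a quick review."""
        for start, end in self.highlights:
            yield self.text[start:end]

    def search(self, query):
        """Return the character offset of every occurrence of query."""
        hits, i = [], self.text.find(query)
        while i != -1:
            hits.append(i)
            i = self.text.find(query, i + 1)
        return hits

book = Book("Digital Design", "A clocked D latch is transparent while the clock is high.")
book.highlight(2, 17)                 # select "clocked D latch"
print(list(book.review()))            # ['clocked D latch']
print(book.search("clock"))           # [2, 43]
```

Real e-reader search would of course want an index rather than a linear scan, but even this naive version beats flipping pages to find one truth table.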
I read an article which touted the Kindle as the next iPod. Please. As innovative and interesting as the Kindle is, if it were even 1/10th as popular as the iPod I think it would be a success in most people’s books. Its price point has to drop significantly from $350 for it to even have a chance.
People seem perplexed by them picking Seinfeld, I mean weren’t they supposed to pick someone hip, cool, and just flat out AWESOME?! Not to be brief, but no, absolutely not. If there is one thing that Microsoft surely is not going to do, it’s try to out-cool Apple. Criticize as you please, but Microsoft is not dumb, and they know that trying to out-cool Apple is a losing battle. Now, I was as surprised as anyone when I heard the news; I didn’t even think Microsoft would have a “spokesperson” for the campaign, since that is an extremely risky venture. But if you are going to pick someone, Seinfeld is an excellent candidate. When you think about the target market, as much as they would love to go after the hipsters, Microsoft has a somewhat older market to cater to. So although Seinfeld has been out of the limelight for a little while now, those of us who are old enough to remember the show like Jerry and don’t have one bad thing to say about him. You won’t find pictures of him on TMZ smoking outside a club or running over paparazzi; in fact, aside from his show, Seinfeld lives a quiet, unpublicized life.
As George said:
George: Well, I got just the thing to cheer you up. A computer! Huh? We can check porn and stock quotes.
I probably should bring up the episode which involves Jerry and computers; the white-laptop crowd will have a field day…
Even still, I will concede the home market. If you have been paying attention, you will have heard that notebooks now outnumber desktops in terms of sales, and I don’t see much reversing this trend in the home market.
The corporate market? Are you kidding me? Clearly the author sees corporate employees as a bunch of busy bees just buzzing their way around the country for months on end, only returning home to the hive for the occasional meeting and suckle of honey (take that as you will). Perhaps Mr. Reisinger has been eating too many dinners out with his journalist cohorts, because although people are traveling more, it is not nearly on the scale he implies. Without sounding too green, I might venture to say that most non-sales people probably spend about as much time on vacation as they do traveling for work. There are, however, practical uses for laptops which don’t require an airport or a gas station. Having a laptop to take around the office, into meetings, even home can be quite convenient, and that is probably where most laptops get their use. When you think about these cases you will notice the fundamental struggle with laptops: size/portability vs. usability. For the most part, the bigger the laptop, the more usable it is (bigger screen, full-sized keyboard, track pad) and the less portable it is (larger, heavier). Yes, some laptops make better use of space than others, but rarely by much. Based on my empirical (and utterly unscientific) study, I’ve found that the median home laptop is 15” while the median business laptop is 14”. As I said before, this is because business users travel more, albeit still not very much.
One thing that becomes apparent when looking at laptops, however, is that they are not even close to desktops in terms of usability. Clearly a smaller monitor has a pretty large effect on usability, but even worse are the pointing devices. Trackpads are the most popular “pointing device” on laptops, and basically the only improvement we have seen in them over the past many years is the scroll bars. Multi-touch, you say? Meh, wake me up when you can do something interesting. And if you have the ThinkPad “TrackPoint”? Ouch.
Wait a minute, why not just buy a docking station and get the best of both worlds? Well, you smart cookie, that’s a good point. You get the portability of a laptop since you can undock it and take it with you, and when it’s at your desk you can use full-size peripherals. Unfortunately, one other thing you get with a dock is a lighter wallet; they do not come cheap at all. Why pay $300 for a docking station when you can get a whole new desktop for $500?
If you aren’t yet convinced that laptops won’t be taking over the corporate landscape any time soon, you should take another look over this article, because you must have missed something.
With that in mind, I am baffled when I talk to people who have and buy $1500+ laptops. Before you get offended: if you require a quad-core, 13”, 2 lb. laptop (or an Apple), then by all means, close your eyes and hand over your plastic. Those crazies aside (I knew I was going to offend people), I have seen a lot of expensive laptops being purchased, and they all tend to share two key traits: they look sexy, and they have very mediocre specs. Clearly a sexy-looking laptop is a very important feature; I know that when I am out and about on the town I cannot be seen without a sexy laptop to match my handbag. Now, I am not going to lie to you and say that Macs/Dell XPSs/Sony Vaios are ugly, because they clearly are not. Yet I find it difficult to justify paying so much more for a nice-looking laptop when realistically most laptops look pretty nice. Before you boil over about my mediocre-specs comment, I’ll explain. One would think that if you were shelling out almost twice the money, you would get a beast of a machine. Not so fast: the low-end models are generally WORSE than the low end of less sexy laptops. So…you pay more to get less? That is correct. But…if you want to pay MORE money, then you can actually get a sexy laptop with decent specs.
Note: For those of you squirming in your seats about how Macs don’t need specs as good to run decently, hold that thought; I’ll get there soon.
Realistically, if they can’t convince a significant portion of more technical users to ask for workspaces, or to use them, then it’s pretty clear why they wouldn’t include them by default. I guess those of us who want them will have to rely on third-party tools to implement them for us, although so far I have not really been happy with what I have seen. Guess I will have to keep looking.
1. Browser w/ email, sports page, slashdot
2. gEdit, terminal, browser for programming reference
3. Either a different programming window (like #2) or a paper I'm writing or reading
4. Usually empty, otherwise it could be another space like 1-3
Now, initially it doesn’t seem like much; you could easily navigate around a Windows desktop with those windows open. So what’s the big deal? Separation. Having email/sports/Slashdot open, refreshing in the background, is just too tempting. All too often I find myself flipping over to that window just to distract myself with Slashdot for a while.
So Linux has workspaces, and as of recently OS X has them as well; that leaves only Windows.
There are a few main arguments for why multiple workspaces are worthwhile. First, many users (myself included) find that workspaces greatly improve productivity. At the jobs I have worked, I have often seen co-workers with TONS of windows open, and they have quite a time managing them. For those of us who don’t have multi-monitor setups (or even if we do), workspaces provide a lot of extra room to work with.
Now, if workspaces are all I have cracked them up to be, why wouldn’t Microsoft jump through rings of fire to add them? Well, there are a few reasons. First, there is probably a large population of Windows users who would likely never use them. Think about the grandmas and moms who use their computers to check email and look at pictures of their kids online; they either won’t know about workspaces or won’t see any use for them. This doesn’t seem so bad, since they would effectively be in the same position as they are now. It turns out not to be so simple: one can imagine how many times windows would magically “disappear” to other workspaces, leaving users with no idea how to get them back. Given the number of people who use Windows on a daily basis, the number of times this would happen would be astronomical.
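The "disappearing window" problem is easy to see in a toy model. This is my own sketch (not any real window manager's API): each window is assigned to a workspace, and only windows on the current workspace are drawn, so a window moved elsewhere simply vanishes from view.

```python
# Toy model of workspaces: a window moved to another workspace stops being
# visible from the current one, which to a novice looks like it vanished.

class Desktop:
    def __init__(self, num_workspaces=4):
        self.windows = {}        # window name -> workspace index
        self.current = 0         # workspace currently on screen
        self.num_workspaces = num_workspaces

    def open_window(self, name):
        """New windows open on the current workspace."""
        self.windows[name] = self.current

    def move_window(self, name, workspace):
        """Reassign a window to another workspace."""
        self.windows[name] = workspace

    def visible(self):
        """Only windows on the current workspace are shown."""
        return [w for w, ws in self.windows.items() if ws == self.current]

d = Desktop()
d.open_window("Email")
d.open_window("Editor")
d.move_window("Email", 1)   # one stray keystroke "loses" the email window
print(d.visible())          # ['Editor'] -- Email looks like it disappeared
```

To the user nothing was closed, yet the window is gone from the screen; without knowing to switch workspaces, there is no obvious way to get it back.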
I think a reasonable solution would be to include workspaces only in the upgraded Windows editions: leave them out of the “Home” editions but include them in “Professional” or “Ultimate.” In doing this, workspaces would be provided to business and power users while home users wouldn’t have to deal with them; everyone is happy.
Originally I wrote about some of the patent issues that have come up with Red Hat and Apple from IP Solutions, and how they would affect Microsoft’s decision to include this feature. In looking into it, there seems to be more going on than I originally noticed, so I’m going to do some homework on the topic before I comment here.
Tune in next time, when I talk about how file sharing and BitTorrent help Web 2.0 technologies.
I think I am part of a very large contingent of people who were (and still are) awaiting the release of the iPod touch V2. Since it was first released I knew the touch was an incredible product, and I enviously played with a few of them. Music, of course; video, excellent; and just to top it off, WiFi. Being able to connect to WiFi hotspots at great speeds, and not having to pay a cell provider a bunch of money for limited service, are huge pluses for the touch. Yet a few things held me back from jumping on one when it first came out: space and cost. Upon its release the touch came in 8 and 16GB sizes, which carried price tags of $300 and $400 respectively. In talking to my compatriot Phil, we both came to the same conclusion: it’s simply not enough space. 16 gigs on a device which can store video, including full-length movies, is insanely small. And this was coming from a guy who was having trouble squeezing just his music onto a 20GB iPod photo. I decided I would wait for the second round of iPod touches, since they would increase capacities to 32 and hopefully 64GB and make some software improvements.
I was really disappointed when I found out they were going to release a bigger 32GB touch, but that it would still be first-gen. Worse, they added it on top of the other two sizes, without a price drop, which left the largest model with a hefty price tag of $500. It also meant they would not be releasing a touch V2 any time soon.
Fast forward to today, and I am still torn about what to do. Surprisingly, Apple has yet to drop the 8GB touch, which means prices have stayed the same. To pour a little fuel on the fire, Apple has continued to stick it to touch users by charging them “nominal” fees for software updates. There have been (I believe) three updates so far which have been free for iPhone users but which touch users have had to pay for. Why, Apple? I don’t see how Apple can justify charging touch users but not iPhone users; either charge both or neither.
I guess for now I’ll continue to wait and hang on to my old-school iPod. Perhaps soon enough Apple will decide to lower prices on touches and toss in those updates for free, and then perhaps I’ll change my tune and put my money where my mouth is.
The main purpose of this blog is simply to give me a place to write my own take on many of the things I see and read about in the tech world. Whether it’s ideas I have been thinking about or comments on things others have written, it ends up here. I welcome comments and opinions on any posts I have made, since these ideas are always up for debate.
More information can be found on my website jkupferman.com.