
Wednesday, August 10, 2011

Password Security - Summary

The idea of using a password is thousands of years old, but today it is most commonly associated with computer authentication. Passwords are kept secret and used to prove the identity of a user on a computer system. Passwords are used more today than at any other time in history: almost every person in the United States has a password of some sort tied to a computer system, including newer uses such as passwords for mobile devices.

Overwhelmed with passwords, many people fall victim to bad habits that weaken the security the passwords were intended to provide. Users often use weak passwords because they are easier to remember and reuse passwords across multiple services. Services often fail to address these problems as well. Poor practices for password storage and site security allow malicious users to access password databases, putting all users of the service at risk.
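To put a rough number on what "weak" means, a password's resistance to guessing can be estimated in bits of entropy: its length times the base-2 log of the alphabet size, assuming each character is chosen at random. A quick sketch (the figures are illustrative, not from the original paper):

```python
import math

def entropy_bits(length: int, alphabet_size: int) -> float:
    # A uniformly random password has length * log2(alphabet) bits of entropy.
    return length * math.log2(alphabet_size)

# An 8-character all-lowercase password vs. a 12-character mixed-case-plus-digits one:
print(round(entropy_bits(8, 26), 1))   # 37.6 bits
print(round(entropy_bits(12, 62), 1))  # 71.5 bits
```

Keep in mind this is an upper bound; human-chosen passwords are far more predictable than random ones.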

The people trying to gain access to your passwords are better known as crackers. A cracker might use your password to access the site it came from, but often their motivation for stealing passwords is to gain access to other services. Crackers have a number of tools at their disposal to defeat modern password safety measures, including rainbow tables, key loggers, man-in-the-middle attacks, and social engineering techniques such as phishing.

There are ways to protect users against many of these techniques. Services can follow best practices for storing passwords and authenticating users. Alternatives to passwords exist. Users can pick better passwords. Software can help users manage their passwords securely. Some effort is required, but users can combine both convenience and security.
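One of those best practices is storing only salted, slow hashes of passwords rather than the passwords themselves. Here's a minimal sketch using Python's standard library; PBKDF2 is just one reasonable choice, and the iteration count is illustrative:

```python
import hashlib
import hmac
import os

ITERATIONS = 100_000  # illustrative; tune upward for your hardware

def hash_password(password, salt=None):
    # A unique random salt per user defeats precomputed rainbow tables.
    if salt is None:
        salt = os.urandom(16)
    digest = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    return salt, digest

def verify_password(password, salt, expected):
    candidate = hashlib.pbkdf2_hmac("sha256", password.encode(), salt, ITERATIONS)
    # Constant-time comparison avoids leaking timing information.
    return hmac.compare_digest(candidate, expected)

salt, digest = hash_password("correct horse battery staple")
print(verify_password("correct horse battery staple", salt, digest))  # True
print(verify_password("wrong guess", salt, digest))                   # False
```

The service stores only the salt and digest; even if the database leaks, each password must be attacked individually and slowly.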

Next time I will introduce passwords with some history and the concept of authentication.

Password Security - Foreword

Computer security is an important topic for me. While I don't consider myself to be an expert, and I know several people who are more committed to secure computing than I am, I am still very interested in the topic. I try to take a pragmatic approach to security, where every decision I make weighs risk, reward, and cost. In the last few years I have become increasingly concerned about my password security habits, and more so about the habits of others. So much so that when the opportunity arose I chose to study the subject and write at length about it.

Earlier this year I took a course in technical writing. The design of the course was fairly neat. Students had to pick the topic for their final paper at the beginning of the course. Each week students had to hand in a writing assignment related to this topic. At the end of the course the final paper largely consisted of the previous assignments with some additional content to glue it together. As you can guess, my topic was password security.

I was inspired by recent high profile hacks and password leaks, as well as my recent switch to using a cloud-based password manager. My report was based largely around these events, though the final version included fewer examples than I originally intended. It is, to some extent, an elevator pitch to attempt to convince others that there is a real danger in insecure password practices.

Of course, a pitch that is never presented has no chance of success. To date, probably only four or five people have read my paper, and at least one of those people learned nothing from it. So, in the spirit of both my efforts to contribute to the world via my school work and to help get the word out that these practices must stop, I will be splitting up my paper into several blog posts to share with anyone who will read it. I will attempt to add value to the paper where possible, such as links to reference articles and examples that I could not fit in the original.

Thursday, July 14, 2011

Google Plus: Modeling Real Life Social Interactions

While the service may be in its infancy, I think G+ shows some real promise. Of particular interest to me are the ways in which the Circles feature models social interactions from the real world. I believe that in this aspect it is far better than Facebook, though it will take time for these interactions to click with users.

Circles: Just Like Real Social Circles

Quickly, think of one of your real life social circles. Most people will think of a group of friends, coworkers, or family that is tight knit and perhaps shares some commonality. In many circumstances, the same people will show up in multiple circles. G+ models this perfectly: you can easily take a person and put them in as many circles as you'd like.

Again much like real life, that person doesn't know that you consider them part of a particular circle unless they know implicitly or you tell them. That person may be on your "frenemies" list. Perhaps you only consider them to be an acquaintance (we'll get back to this), but you don't want them to get the wrong idea that you're keeping them at arm's length.

You may never use this feature to its full potential, but one of the aspects of social software is that it allows you to organize and catalog your life in new and novel ways. The implementation on G+ is both easy and visual. You may learn something about how you think of your friends by attempting to put them into circles.

Sharing Controls Allow More Frank Conversations

When you share something on G+ a key feature is that you can easily limit or expand the scope of sharing. I know some people consider this to be a confusing extra step but it is necessary to model these interactions. It allows people to conveniently have separate social circles that need not often interact.

Think about it: is your mom or boss on Facebook? The answer is increasingly "yes." With Facebook's privacy settings it is complicated to avoid sharing sensitive information with these people. It's likely none of your boss's business that you were out partying all weekend, but it is all too easy to inadvertently tell her just that. To avoid this you must either not befriend these people on Facebook (smart, but sometimes awkward) or go through a fairly unintuitive procedure to modify who can see a particular post. It's not impossible; in fact, I have custom security settings that keep several people who are officially "friends" from seeing the content on my wall. But it is nowhere near as intuitive nor as central as it is on G+.

As an aside, I think Google is betting that by giving you more control over who sees what you will in turn feel more comfortable sharing things. If that proves true, then people who share relatively little now could find new life in a product like this. Also, I should note, the additional control is not absolute. Just like in the real world, if you say something to anyone, that person has the ability to share it. Digital communications are easier to copy and verify, so this doesn't give you carte blanche to trash talk your employer or openly cheat on your spouse.

Dealing With Acquaintances and Beyond

The way Circles work will allow for far less awkward interactions with people you don't know or don't know well. If a random person adds you to a Circle, you can simply ignore it and they will only see public posts. Any posts they share with you will go to your Incoming page. You simply don't have to see those people, and it requires no action on your part. Sure, you can block them if you really want. A better strategy is to treat public posts as you would any other public speech: don't say anything too personal or socially unacceptable, and you won't have to do anything about those people at all.

Say the guy you met at a party last weekend adds you. You might share some things with him based on what you know about him, but you don't want him to know too much about your personal life, so file him under Acquaintances. When you share personal info, don't share it with Acquaintances. Or create another group that's even less intimate. Most sharing of this sort barely has a real world model, because many acquaintances don't interact frequently after the fact; even if you never share anything with these people, you shouldn't feel bad about it.

A Conversation With a Circle

In the real world it is unlikely you will have a chance to talk to people from all of your social circles at once more than a few times in your life. The one time this is likely to happen, at your wedding, is something that many people only have happen once, and others only a few times. Good luck getting your coworkers to buy you a fourth wedding gift.

Instead, you probably have conversations with one circle of friends at a time. It's likely that you tell these groups many of the same things, but you probably choose not to tell certain groups certain things, and other times you probably change things slightly to match the group. Each group likely reacts differently, even if similarly, to the same conversation. Sometimes, you would tell two people the same thing, but not when they are in the same room. That's how social dynamics work. It's a dynamic that Facebook breaks and G+ models somewhat correctly.

For some people, Facebook has changed this social dynamic forever. Any public announcement will be just that: public for all and for all to comment on. They probably value the varied interactions of their different circles of friends meshing together. Fortunately for these people, G+ offers the "My Circles" and "My Extended Circles" sharing settings, not to mention "Public".

For everyone else, the genie can be put back in the bottle. If you have a conversation with one group of friends no one else need know. You can have the same conversation multiple times shared with multiple groups and avoid any interaction.

Why would you want to do this? Well, maybe you want to give your close friends the low-down on your trip to Cancun, but you want to share photos with your family. You don't want your rowdy friends commenting where Granny can read. Or, maybe you know people from Ohio and Michigan and you want to discuss the fine mess that OSU's football program got itself into, but you'd rather it not become a huge flame war.

A Conversation From a Circle

Here's another key difference. Right now G+ does not have a "wall" that anyone can write on. Some people think this is terrible, others love it. I like it because it gives me control over who sees what my friends say to me. However, the real benefit of this is that it models how interactions from a circle of friends to you work in the real world.

If you're hanging out with a circle of friends one of them might say something to you that everyone in the circle can hear. This could strike up a conversation within the circle, and maybe it's a story you would recount later to others but people outside the circle would not likely be involved.

How you model this in G+ is to make a post directed at your circle and tag the person you are speaking to. This will allow your mutual friends to comment on the post. If your friend wants to share it more broadly he can do so by clicking Share and selecting more of his circles. By sharing it with your mutual circle of friends you can have the same sort of intimate, candid conversations you would have in the real world. If it's something you want more people to talk about, you retell it by sharing, just as you would in person.

This again empowers you to control who sees what information. If you think about Facebook's wall, the idea of allowing someone to write on your wall is like asking for someone to write graffiti on your house or draw a penis on your forehead. Sure, it's also like having them sign a cast, but even then they normally ask permission. Think of resharing as your wall plus asking permission.

Public Speaking

Public speaking is something that Twitter does pretty well. Conversations on Twitter are so disjointed that it is more a broadcast platform than anything else. Of course, conversational discourse is kneecapped on Twitter due to the size limitation. Facebook makes most of the things you say into a semi-public event that is invite only. Unless your profile is open to the public only your friends will see it, but then those people not in your friends list can't interact with it. G+ is modeled a little bit after both services, allowing you to have both private and public conversations. However, G+'s public conversations are far superior to Twitter and more shareable than Facebook.

Anything you post that is aimed at the Public should be considered something of a seminar. It's like gathering all of your friends, acquaintances, fans, etc. into a big room and inviting anyone to comment. You can assume that this will be fairly public, as it is tied to Google after all, but the people who will immediately know about it are the ones who have you in their circles. Thus, you practically have an attendance roster right on your Circles page. Unless you disable comments, anyone with a G+ account can comment, so these items allow truly public interaction.

Getting Along with G+, Acceptance & Adoption

When we deal with a new service like this one we must be careful. Some people will proclaim it the next big thing, others will call it DOA, and still others will begrudgingly drag themselves along for the ride. We'll recall Google Wave (over and over again) and Orkut. We'll think of MySpace, which is funny because it was a huge success that just didn't have staying power. Maybe we'll think of all the other projects Google has done that no one would give a chance to, yet which have proven popular over time, like GMail, Google Maps, and Android.

Chances are that people like me will be more lenient on the service. I don't mean because of the reasons laid out above, but rather that I tend to love Google interfaces. Even their quirks often agree with me. I try to check my fanboyism and be objective. Certainly, as someone who does interface design for a living I can be critical of their choices. Still, it works for me for the most part.

It's also important to remember that this service was launched early in the development stages. It is clear that they intend to follow their pattern of rapid iterations and live testing. Google is capable of developing slick interfaces that work well, but often their first generation is somewhat clunky and pointedly favors geek culture with features like keyboard shortcuts. If you're not so much of a geek (or sometimes if you're just that much of a geek) then you won't appreciate this as much as people like me.

I'm sure there are more ways that this service both mimics and deviates from real life social interactions. After all, it is a piece of software and it does do things that are impossible or difficult to physically accomplish, like bringing together people from geographically divergent places. However, I'm not exactly a social scientist, nor will I proclaim myself to be a social media expert. This is all I've thought of up to now, and it was inspired by several conversations with various friends. This may not be the last I write on the subject; I only hope that the next thing I write isn't a post-mortem.

Thursday, August 19, 2010

Windows Server 2008 R2 and COM Objects

So we just went through a huge ordeal while trying to decommission an old server and move a legacy website onto a new one. The old server was a 32 bit Windows Server 2003 machine; the new one is 64 bit Windows Server 2008 R2. The website is classic ASP that uses Dimac's JMail control and SoftArtisans' FileUp control, which are both 32 bit.

The result was several HTTP 500 errors and a relatively generic set of log messages. Actually, we didn't always see anything in the logs, which was fairly perplexing. When we did get error messages they looked like this:


800a01ad|ActiveX_component_can't_create_object

8000ffff|Enque:_Error__no_pickupdirectory_found.

80004005|[Microsoft][ODBC_Driver_Manager]_Data_source_name_not_found_and_no_default_driver_specified

The first part of the solution was to set the application pool for that website to 32 bit. You do this by opening IIS Manager, select Application Pools, select the application pool you're modifying, in the Actions pane click Advanced Settings..., then set Enable 32-Bit Applications to True. After you've done that, reset IIS and try your site. If the site still isn't working, reboot your machine. Apparently Windows isn't all that keen on switching between 64 and 32 bit, so sometimes a reboot is in order. [We had to.]
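If you prefer the command line, the same change can be scripted with appcmd. This assumes an application pool named "MyLegacyPool"; substitute your own pool name:

```shell
%windir%\system32\inetsrv\appcmd.exe set apppool "MyLegacyPool" /enable32BitAppOnWin64:true
iisreset
```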

Maybe that will fix your problems; it didn't for us. In fact, the ODBC error seems to stem from this action. Apparently when you create an ODBC DSN it's only available to 64 bit processes. You have to create the DSN in the 32 bit space by using the 32 bit Data Source Administrator from the SysWOW64 directory [odbcad32.exe].

We still had those pesky JMail errors. The object was created successfully but we were getting File I/O errors. We followed advice that suggested we should copy the DLL to the SysWOW64 directory and register it there. No change. We modified the permissions on the DLL to allow everyone to read and execute it. No change.

What we hadn't thought of yet was to modify permissions on the SMTP pickup folder. We didn't think of this because the test page we'd created had everything but the instantiation commented out. Still, this simple change made all the difference. For whatever reason, giving write permission to the IUSR account didn't fix this. We tried a few other users before we gave up and just gave all local users write permission to the pickup folder. This did the trick.

Overall this was a royal pain. That's why I stopped to do this quick write up. I hope it helps.

Thursday, April 15, 2010

Feature Creep: The Enemy of an In-House Developer

If you do in-house development then you probably have first-hand knowledge of feature creep. If you don't know what that is, or you haven't seen it happen, then I envy you. It's an ugly monster and the bane of my work. The problem is not limited to in-house developers; it is particularly acute in that environment, though.

This is because development cost is a taboo topic in most in-house development environments. The developers and their immediate management don't want to talk about it because they don't want to remind senior management that in-house development is expensive. Senior management is happy to ignore it because they crave the fine-grained control, costs be damned. So a game is played to balance the quality of the product with costs, where the quality is generally lower than a prepackaged system and the cost is generally higher. In general, this works fine. It helps keep developers employed and management happy.

The real problems start as the customization culture trickles down the management chain. Depending on how vertical the structure of your company is, you may have dozens of management steps between the top and the developers. Typically, when software is not a primary focus of the company, your developers and their managers will be quite low on the org chart. This puts them at a disadvantage when dealing with almost any manager and makes saying no quite difficult.

Therein you have the perfect storm: a situation where discussion about cost is taboo, but the number of powerful voices calling for increased cost is huge. It's worse yet, though. When management's focus is not on software but instead specifically on how to make software work for them (as in, each manager personally) there is often little to no concern about the cost of feature creep to usability. In fact, usability is typically not in a non-technical manager's vocabulary. Outside of the developer group no one cares about usability, and often inside the developer group it is neglected because of the notion that the client doesn't care.

Unfortunately, even if the user thinks they don't care about usability they really do. The difference between great software and passable software is often usability. The difference between a truly happy client and a client who is merely happy that development is done is usually usability. Software is supposed to solve a problem, to help a user achieve a goal. If it is too difficult to use it creates more problems than it solves, or it hinders the users from achieving their goals. At that point it should be considered a failure, though reality shows that this rarely happens.

Failed development projects, whether they are recognized or not, are dangerous. They put developers on shaky ground. They cause managers to think of the taboo of their in-house development: cost. If they aren't thrilled with the product then they will be far more likely to consider whether it is worth the money. The only thing that keeps this from happening is that they are often too egotistical to admit failures, but eventually if feature creep continues they will come around. When they do they will externalize the failure to the development group, after all it is their job to make this software and if it's so much more expensive then it should be better, right?

Wrong. There is little connection between cost and quality in software. Devs don't dare tell a manager this, except maybe as a last resort, lest he add things together and realize that he could put company resources to better use.

Of course, most developers actually do want to put resources to better use. With fewer pet projects on their plate, most developers will try to automate processes to save the company money. They'll refine existing systems to make them more efficient. In this aspect, in-house developers yearn to be more like system and network administrators: if you can't tell I'm there, then I'm doing my job. Occasionally they might venture off course to test some new technology or try to solve a particularly complex problem, but for the most part a dedicated in-house developer is happy with the sense of accomplishment that comes from knowing that his or her product made a truly positive impact.

Developers have a responsibility to fight against feature creep. Don't buy into the false mantra that your job is to do as you're told. Your job, at any level of any company, is to act in the company's interest. That means to be truthful about costs and try to help management make the right decisions. There is no room for complacency in this. Feature creep is the developer's enemy and it is our duty to fight.

Monday, January 4, 2010

Two Points Each, Mac and PC

This is part of a series of reprints from my classes. Once the class is over, I will lose these if I don't save them elsewhere. I've decided to post them here as they may be of some interest. This is from my Introduction to Information Systems class, which I was too lazy to test out of.

Advantages of a Mac:
  1. Compatibility. While the general thinking is that Windows machines have the most software and the world works on Windows, so you can do anything on it, I believe this is a short-sighted, non-technical viewpoint. The reality is that Microsoft makes very little attempt to support standard formats; instead they spend their time developing their own formats, which then become de facto standards. Despite Microsoft’s practice of stunting compatibility using patents and copyright claims, Mac owners enjoy the ability to open most file formats without the need for additional software. Windows users have to install third party applications to open standard formats such as PDF.

  2. “It just works.” When Mac users say this they aren’t talking about the lack of viruses and other malicious programs. They’re talking about usability, compatibility, and stability. Due largely to Apple’s software and hardware philosophies, Mac owners enjoy a relatively hassle free experience. The interface is highly consistent and few developers of software for Macs stray from the conventions. Apple’s control of the hardware means few driver issues ever arise. Their design standards for the hardware add another element of usability. Once you learn to use OS X you rarely have to think about it, it just works.

Advantages of a PC:
  1. Ubiquity. Microsoft was able to grab the business market. This is the true key to their success; without the exposure and indoctrination of millions via the workplace, the home computer revolution of the 90’s would not have been possible. Now that everyone knows how to use Windows, they find themselves confused by the design differences between the PC and the Mac, which can make short stints on a Mac frustrating. Also, since the market share is so skewed in favor of Windows, the onus is on users of other operating systems to make sure that Windows users can interact with their content. This is a distinct advantage for Microsoft and, to some extent, the users of its OS. While Microsoft is free to create file formats like WMA with limited implementations outside of Windows, other vendors are not afforded such a luxury, and Windows users don't have to worry about receiving a deliverable in a format they can't open.

  2. Availability. Windows is also highly available, and so are the devices on which it runs. You can get a Windows PC easily and cheaply. The lack of vendor lock-in for hardware means that manufacturers are able to race to the bottom on price. This means that Windows PCs are available to a wider audience and they are infinitely configurable.

Thursday, December 24, 2009

Christmas Cable Management

Stop right there! Don't throw away all of that useful packaging! As my gift to you, I'm going to show you how to reuse those annoying twist ties that are included in virtually every toy you buy.

I have a two year old son. He's having a great Christmas. After I took a few gifts out of the original packaging I was left with a mess of twist ties and little plastic anchors.



These ties are used to keep items securely fastened while allowing a clear view and even a trial touch of the item inside. I wasn't originally a fan of these, though they are at least less annoying than bubble packaging. Now I save them whenever I get them. Why? I figured out that they are perfect for cable management solutions.



The first use I realized was that the ties are strong enough that you can wrap them around cables to keep them together and you don't even need to tie them after. Just wrap around a few wires until you run out of tie. The wire is fairly strong and somewhat stiff, though the gauge varies with each product.



Tonight I realized how useful the plastic anchors could be. They have holes in them for the wire ties to pass through and anchor points to wrap the wire around. They come in varying sizes so you could use different ones at different points to keep your cables orderly. As you can see, I've put the wire through backwards so that the cable I'm tying down sits between the anchor points, then I run the wire tie around the back and wrap it in a figure eight to lock it down. On most of these there's even an extra set of holes on the ends, you could use this to lock down another cable or you could drive a tack or screw through it to secure the anchor to a wall or desk.

The great thing about these is that they're free and they work very well. They hold nicely but they're easy to undo. You can reuse them, too. If you're an environmentally conscious geek father like me, these are a great solution to a couple of problems.

Monday, December 21, 2009

A Group is Its Own Worst Enemy

This is part of a series of reprints from my classes. Once the class is over, I will lose these if I don't save them elsewhere. I've decided to post them here as they may be of some interest. I was trying to write about my favorite piece of writing. I succeeded and I failed. I do love Shirky's piece but I've thought since that I should have selected another work.

On April 29, 2003 Clay Shirky gave the keynote at the O’Reilly Emerging Technology conference in Santa Clara. I wasn’t there. Fortunately, Mr. Shirky saw fit to post the text of his A Group is Its Own Worst Enemy keynote online shortly thereafter. A few years later, on the recommendation of a friend, I read this for the first time. I’ve returned to it frequently since.

The topic of Shirky’s piece is social networking. He commented on this shortly before the massive explosion of self-aware networking sites and just as blogging was becoming a mainstream concept. While the topic was hardly ahead of its time, many of the focal points were of the distant past. He saw fit to remind everyone that group dynamics and human interaction are nothing new. Neither, it would seem, are the troubles that social software operators encounter as those group dynamics are at work.

He begins by explaining the title and its origins. He tells the story of W. R. Bion, a psychologist who in 1950 published the results of a study of a group of neurotics in his paper Experiences in Groups. It is Shirky’s opinion -- if not Bion’s; I have not read that paper -- that we can determine many behavioral patterns of a group from this study. He explains using parables of Internet communities that have long since passed, most notably “LambdaMOO.” Then he explores the question of why social networking is about to explode. While he continues in-depth on the subject, he begins this analysis with the conclusion: because it’s time. In retrospect we can see how right he was. Still, it’s enlightening to see that moment captured and understand how everything started to come together.

Lastly, he offers advice on what not to do if you are running a community and what you may want to plan for at the onset. As someone who has participated in numerous online communities and created a few this is almost sacred text. Yet, I believe that most participants in communities could benefit from this thousand-foot view of how they operate.

I find myself drawn to this text so strongly because it all rings true to me and many of the topics are ideas that I have expressed at some time or another. Shirky brings everything together with great style, though. His words are straightforward and mostly simple. He balances heavy content with friendly presentation that does little to scare away the non-technical reader. I believe the true power of A Group is Its Own Worst Enemy is in the ability to make any member of a group more aware of the role they play. In some cases, they may not realize that they are part of that group at all. I think the most important audience for this, though, are those who seek to create, run, or oversee a group. For that audience I believe that this should be required reading.

Monday, November 23, 2009

Backups

This is part of a series of reprints from my classes. Once the class is over, I will lose these if I don't save them elsewhere. I've decided to post them here as they may be of some interest. This is from my Introduction to Information Systems class, which I was too lazy to test out of.

I don't back up my computer, per se, so much as I back up important content. Things like photos, videos, and documents I replicate between multiple computers. As much as is reasonable, I also try to put these files into the cloud using one service or another. For instance: many of my photos are on multiple computers, uploaded to Picasa Web, and also uploaded to Facebook. I store almost all of my documents in my Dropbox account, which stores the documents on their servers and automatically replicates them to multiple computers.

I admit that this is not an entirely adequate backup solution, but it's worked quite well so far. Two years ago my laptop died. I was able to recover the hard disk from it, only to find that I didn't need any of the data off of it. Last year the hard disk in my wife's computer died and almost nothing was lost. Even though I've been fairly lucky, I am working to rectify the situation with some more formalized and complete backups.
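For anyone wanting to formalize the same replicate-the-important-stuff approach, a minimal one-way mirror is easy to script. This is a hypothetical sketch, not my actual setup:

```python
import shutil
from pathlib import Path

def mirror(source: Path, dest: Path) -> int:
    # Copy every file from source that is missing from dest or newer than
    # dest's copy. Returns the number of files copied.
    copied = 0
    for src in source.rglob("*"):
        if not src.is_file():
            continue
        target = dest / src.relative_to(source)
        if not target.exists() or src.stat().st_mtime > target.stat().st_mtime:
            target.parent.mkdir(parents=True, exist_ok=True)
            shutil.copy2(src, target)  # copy2 preserves timestamps
            copied += 1
    return copied
```

Run something like this on a schedule against a second disk or machine; a real backup tool adds versioning and deletion handling that this sketch deliberately lacks.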

At work I save everything onto network storage which is backed up using NetApp's SnapVault.

Friday, November 20, 2009

Monitoring Internet Usage

This is part of a series of reprints from my classes. Once the class is over, I will lose these if I don't save them elsewhere. I've decided to post them here as they may be of some interest. This is from my Introduction to Information Systems class, which I was too lazy to test out of.

I'm not against monitoring internet usage in the workplace. I am against ham-handed management of people's communication. Often I find that this argument is oversimplified: You're at work to work, and you can't possibly be working if you're online on an outside instant messenger or checking your email. This argument ignores all other factors, such as lengthened work-weeks and jobs where productivity is held at a higher value. I think that a situational approach and proper management are required; no monitoring technology can replace them.

This boils down to metrics: Is someone's productivity and worth to an organization measured in how much time they spend online? I posit that it is not. You are not paid to not use the Internet; you're paid to perform certain tasks. Depending on the type of tasks and the expectations of your employer, this may preclude using the Internet, but it likely does not. Instead we should judge employees based on their ability to get their job done; only when they fall short of that should we question how they use their time.

Work/life balance is another issue to consider. When employers ask for more and more of their workers' time they should expect a compulsion to bring home life back into the mix to find a better balance. This is especially true when it comes to IM, where that communication can be vital to maintaining healthy home relationships. It can also be said that the workplace continually creeps into home life. How is IM unacceptable at work when BlackBerries are required to be on at home? Again, this is about balance and it will vary individually. The employee who works minimal hours has less claim to this than the one who works dozens of hours of overtime, and some employees allow their home lives to affect their work. Managers should deal with these employees individually and realize that Internet usage may not be a particularly useful metric for fixing the problem.

Lastly, I will side with employers from a Human Resources perspective. I think this is where monitoring, and even filtering, is important. Employees should know they are being monitored and they should have a few clear usage guidelines for the Internet. It may be acceptable to communicate with your family and friends, but not everything is acceptable to do from work. Companies need to take a zero tolerance stance on pornography, discriminatory practices (take, for instance, the Human Rights Watch worker who was recently found to be posting on Nazi bulletin boards), harassment, industry secrets, etc. Such offenses should be taken extremely seriously and should be actively monitored. Employees that cross the line should be dealt with immediately. Policies like this should be clearly stated, though.

Schools are somewhat similar. I think there, since you're likely not dealing with adults, you should be a little more proactive in monitoring and stopping abuse of technology. I see most of this as twenty-first-century note passing. Other content should be filtered, though pretty much any filter can be broken. This is still a situation where filtering and monitoring will not take the place of parenting and teaching. If a child is struggling you might look at abuse of technology as a contributing factor, but it is dangerous to assume that it is the definitive factor and even more dangerous to act on such an assumption without considering how it may affect the child.

With parenting, I think that young children should be monitored closely. This isn't to say that I'm afraid of what they might see or who they might talk to. It is that they are far more likely not to understand, to take things wrong, and to make poor assumptions about what they're seeing. I don't want my child reading a hate website unattended, lest they believe such foolishness is true. I don't want them to use social networking sites unattended, more because of cruelty like that of the Lori Drew case than worry over someone appearing on To Catch a Predator. The younger the child, the more help they need with interpreting the situation. Eventually they grow older, at which time I would scale back monitoring only to avoid more serious problems such as lawsuits over infringement. Though such things may be a little easier to block than to monitor.

Wednesday, November 18, 2009

Operating Systems

This is part of a series of reprints from my classes. Once the class is over, I will lose these if I don't save them elsewhere. I've decided to post them here as they may be of some interest. This is from my Introduction to Information Systems class, which I was too lazy to test out of.

I have a really funny story about how I crashed the VAX server at my father's work in 1983, but I'm going to spare you. Instead, I'm going to focus on operating systems that I have more direct, coherent interactions with. I'll try to do this in chronological order.

MS-DOS 4, 5 - My first experience really managing a system. High memory, what a throwback.
Windows 3 - I strongly preferred DOS to Windows at this stage; I thought of it as a gimmick and totally unstable.
MS-DOS 6.22 - The high point for DOS. Improved memory management (though I remember we used QEMM) and disk management. Good stuff. Anyone for a modem game of Doom?
Windows 3.1 - That minor revision made a big difference. Also, the business world started to catch on with Windows so more utilities came out. This was about the time I started using the Internet, and compared to today it was absolutely terrible. I still preferred DOS when I could use it.
Windows 95 - It was such a big deal, and it was a huge improvement. I think it was slightly overhyped. Ultimately I found myself still going into DOS for a lot of things.
Windows 98 SE - I think this was the high point of the Windows 9x line. We waited until Second Edition was released before we upgraded. It required some initial work to make it run well, but after that it was rock solid.
OS 9 - I helped a few friends who had this with their computers. It was neat, but I absolutely hated it. It was so difficult to do any maintenance on the system, and everything was so slow.
Windows Me - Oh my, what a disaster. I don't recall any useful feature upgrades from 98 SE, but it seems that Microsoft tried to do too much with the 9x code base. It didn't work; this was the most unstable and unusable OS I've ever experienced.
Windows NT - I have limited experience with this, as I switched employers and they were on the verge of upgrading to 2000. Still, I used it. It was largely unremarkable.
Windows 2000 - By combining the architecture of the NT series with the better UI of the 9x series, this was a huge improvement over everything out there. I'm less thrilled with 2000 server.
Windows XP - I remember how excited I was that the better architecture of the NT series would be available to home users. On the down side, Microsoft created a highly networked OS that largely ignored all of the security lessons learned in the Unix community, which led to rampant viruses and onslaughts of malicious software that continue to this day.
FreeBSD 5 - This was the best server OS I've used. It was highly stable, it performed great, and the Ports system is awesome. I was able to do so many various things with this system it's hard to believe. I later regretted switching away.
VectorLinux - After inheriting a relatively ancient laptop I was able to use this Slackware variant to get it working. It has a tiny footprint but provides little in the way of ease of use.
Gentoo Linux - When it came time to replace my FreeBSD machine I chose this OS. It had more active development and great documentation. Unfortunately it also had days of compiling and eventually dependency problems.
Windows XP MCE - This was the best version of XP. It has a slightly better UI and just the right mix of enabled features to allow the home user to get things done. Specifically, I liked that it had IIS so I could do ASP.Net development without a hack at home.
Ubuntu Linux - This is by far the best that Linux has for the home user. Setup is a breeze and it recognizes tons of hardware. Of course, using Linux can be quirky, and this one comes so close to being complete that it's a letdown when something that "requires" a Microsoft product forces you to stop using it.
Windows Server 2003 - Good improvements over 2000, I like IIS 6.
Vista - I used this a few times. What I saw was that Microsoft tried to fix the security problems they've had and overshot, creating an annoying system of prompt after prompt after prompt. I noticed that after a week of using this OS most users would dismiss any and all dialog prompts without so much as a glance. They shifted the security problem from systemic to psychological. It was enough to tip the scales for me to buy a Mac.
OS X Leopard, Snow Leopard - I'll admit that I waited a long time to really try out OS X. I knew what it was like from years in the industry. OS X gets so very much right, and with a few tweaks it's an absolute dream to use. Most things in OS X just work, the usability of the OS is great, and I don't have to jump through hoops to get it to work with most of my stuff.
Windows Server 2008 - I only recently started using this. I'm not sure I've seen a huge improvement over 2003, especially in the management interface which I haven't gotten the hang of yet.
Windows 7 - For the first time since OS X was released it seems that Microsoft has taken the lead in usability. The security problems seem to finally be fixed: there's a clear point where you have to tell the system that you want to be an administrator, but you're normally just a user. I'm very excited for this and I can't wait for Apple to truly respond.

Monday, November 16, 2009

Favorite and Daily Use Applications

This is part of a series of reprints from my classes. Once the class is over, I will lose these if I don't save them elsewhere. I've decided to post them here as they may be of some interest. This is from my Introduction to Information Systems class, which I was too lazy to test out of.

My favorite application is Google’s Picasa. Picasa is a great photo organizer that allows you to effortlessly move photos from your digital camera to the web or print. It has most of the tools needed to clean up an image, and they’re all very easy to use. Yes, I could always open Photoshop, and sometimes I do use it for a particularly troubled image, but normally that is akin to driving a finishing nail with a sledgehammer. Picasa is powerful enough that it works for more advanced users, but simple enough that even a novice can use it with relative ease. That range of usability is extremely impressive.

However, I do not have a need to use Picasa daily. My favorite daily application is Firefox. The best thing about Firefox is that it just works on almost any platform. It doesn’t matter if I’m on my work computer, my Macbook, my Windows 7 machine, or a Linux installation. All of them have Firefox and it works with very little deviation in function. This ubiquity has led me to use more in-browser applications as substitutes for desktop applications, such as GMail instead of Outlook, or Google Docs instead of Excel.

Since I use so many computers, another application I would be lost without is Dropbox. Again, it’s cross platform, and again it just works. Dropbox creates a folder in your profile that it monitors for changes. When you add a new file to that folder it uploads the file to the Dropbox server. Once the file is uploaded, your other computers will download that file immediately if they are online, or upon the next login. Also, you can login to the Dropbox website and access those files from any computer. It’s far more convenient than carrying a thumb drive. Did I mention that it's free for up to 2GB of storage? Well, it is.
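The monitor-and-replicate behavior described above can be sketched with a simple polling approach. This is just an illustration of the idea, not how Dropbox actually works internally; real sync clients use OS change notifications and content hashes rather than modification times:

```python
import os

def snapshot(folder):
    """Map each file path under folder to its last-modified time."""
    state = {}
    for root, _dirs, files in os.walk(folder):
        for name in files:
            path = os.path.join(root, name)
            state[path] = os.path.getmtime(path)
    return state

def detect_changes(old, new):
    """Compare two snapshots and return (added, modified, removed) paths."""
    added = [p for p in new if p not in old]
    modified = [p for p in new if p in old and new[p] != old[p]]
    removed = [p for p in old if p not in new]
    return added, modified, removed
```

A sync client would run this loop periodically: upload anything added or modified, delete anything removed on the server, and pull the reverse set of changes back down to the other machines.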

Other daily applications include Microsoft’s Outlook (for work email), Visual Studio 2008, and SQL Server Management Studio 2008. At home on my Mac I use Quicksilver, which is basically the best application launcher ever, and I proof most of my work in Pages.

Thursday, November 12, 2009

My Computers and Mac vs PC

This is part of a series of reprints from my classes. Once the class is over, I will lose these if I don't save them elsewhere. I've decided to post them here as they may be of some interest. This is from my Introduction to Information Systems class, which I was too lazy to test out of. While working on this I compiled a list of my computers that I posted earlier.

About 50 hours a week I use my work machine, a Dell Precision M65 laptop running Windows XP Professional SP2. When I’m at home I primarily use an Apple MacBook White laptop running OS X 10.6 (Snow Leopard). Often, I connect into my file server, which is a Dell Optiplex 170L desktop running Windows XP Professional SP3. The file server, which I have dubbed “Kowalski,” is in my office and does not have a keyboard, mouse, or monitor attached to it. Other than hosting copies of my media files, it also serves as secondary desktop and I use it as a print server.

Other machines in my house include my wife's Lenovo S10 netbook, a Dell Dimension 4700 desktop running Windows 7, and a Compaq Armada 1500 laptop from 1996 running VectorLinux that I saved for my toddler to play with. I received the Compaq laptop roughly 6 years ago; at the time it was my first laptop, so I worked hard to make it usable again. I'm proud to say that it works well for light internet use and simple games; it even has a working wireless NIC.

As for my thoughts on the Mac vs. PC debate. Well, I find that it’s not much of a debate. Instead you have a majority of people who simply don’t care and a tiny minority of geeks who are passionate about one system or another to a religious extent. Very little debate happens due to this, instead each side focuses on circumstantial issues, biased opinions, and stereotypical members of the other camp. While this is great for strengthening the resolve of the group, it’s terrible at exposing the true strengths and weaknesses of each operating system.
In my opinion, the market leader (not to be confused with the sales leader) changes every few years. It's about to change back to Microsoft, after Apple has enjoyed several years of superiority with OS X. The problems with Windows over the last several years have been security, polish, and a fear of breaking backward compatibility. Apple's issues have more to do with their longstanding inability to attract corporate users [builds familiarity] and software vendors [more tools to get things done], and the cost of entry.

Microsoft made a great stride in addressing their issues with Vista; they came close to fixing some of the worst security problems. Unfortunately, Vista is bloated due to the backward compatibility, and it is severely lacking in polish. [For a great breakdown on the polish issue search for "Joel Spolsky Yale talk" on Google.] After some time using Windows 7 it is clear that Microsoft has further refined their security, nailed the polish, and it seems that their implementation of backwards compatibility was taken right from the OS X playbook.

Meanwhile, Apple has mainly rested on their laurels with their operating system. The jump from OS 9 to OS X was huge, and for good reason: OS 9 was terribly outdated and only the staunchest Mac users remained. Since then, they've further polished the system, and I can say from experience that Snow Leopard has great usability. The only issue they've addressed at all in that time has been entry price: you can get a computer similar to mine for about $900. I did find that there are plenty of software vendors in the Mac world; I only ever need a Windows desktop when a site requires Internet Explorer or to verify that my Pages document is formatted correctly for Word. But I know that plenty of people out there require software that you cannot find for the Mac. Similarly, I've seen almost no increase in consideration for Mac users in the corporate world. Firefox has done much on the Web to expose the need for platform independence, but little else has changed.

Last year when I bought my MacBook I did so because I was fully aware of the issues with Vista. I did not want to buy a Vista laptop. I knew about the Vista Capable debacle. [Though I don't know what happened to the lawsuit that it caused.] When my Inspiron 6000 died I knew I would have to buy either a Mac or a PC with Vista, and at the time a PC with similar specifications was no cheaper than the MacBook. Ironically, Vista's issues and the success of netbooks have pushed the PC manufacturers to sell respectable machines for far lower prices. Right now the PC truly is the better deal.

Windows 7 will re-energize Microsoft’s slumping sales. If we can assume that the price of a new PC will remain somewhat flat, or only rise a small amount, then I think they will fly off the shelves. People will be happy with them, and the bleeding in the laptop segment will stop for Microsoft. The debate will still go on, but it’s clear that competition is a good thing.

Tuesday, November 10, 2009

Favorite Websites

This is the first post in a series of reprints from my classes. Once the class is over, I will lose these if I don't save them elsewhere. I've decided to post them here as they may be of some interest. This is from my Introduction to Information Systems class, which I was too lazy to test out of.

Like the vast majority of Internet users, I get my search results from Google. I avoid Live.com like the plague. Interestingly, I recently took a blind comparison between the three major English language search engines and found that I preferred Yahoo! slightly over Google. That is not enough for me to change the default search on my phone and many computers.

I use GMail for almost all of my email needs. When I was given a GMail invite long ago I admit that I was skeptical. Ultimately, I think that GMail's concepts of email conversations and labels were revolutionary. I know they invented neither, but their implementation is top notch. I can hardly wait for Google Wave.

Facebook is the unquestionable king of social networking. No site on the Internet is better at helping you find and stay loosely connected to a group of people. Their suggestion data mining is so good it’s a little scary.

Netflix is my favorite site, and my top pick for entertainment. I've been using Netflix for seven years. In that time I've seen the site grow from a simple rental-by-mail service to a community of movie fans. This site has the best selection of streaming content on the Internet, though Hulu is closing fast. I'm also a fan of Bill Scott, the director of UI engineering for the company. I've rated over 1400 movies, according to my Netflix profile, and roughly 500 of them were rented or streamed from the company.

Honorable mentions include: SlickDeals.Net for bargain hunting; Lifehacker for, well... "lifehacking;" and Wondermark.com for humor.

Tuesday, June 9, 2009

Hulu: The Best Streaming Video Site?

Hulu is the undisputed champion of free streaming video sites. What if we remove the word 'free'? Is it still the best?

I watch a couple shows a week on it. I used to watch more, but I eventually ran through the interesting parts of their anime library. I've also tried to limit the time I spend watching TV, even if the computer is functioning as my TV.

Hulu has a great selection of content. Their strength is the same sort of content you would find on both major and cable networks. The television-sourced content is top notch, but their movie selection is mostly tired and old. As I mentioned, they have a respectable selection of anime, mostly series but a few movies as well.

The video quality is pretty great, if you have the hardware for it. Much of the content is available in two quality settings, 360p and 480p. Some is only available in the lower quality 360p. If the original was wide screen then so is the Hulu version. [At least as far as I can tell.] The 360p version is watchable and most likely to run smoothly, the high resolution content is better than standard definition television but not as good as 720p HDTV. [The numbers really give it away.]

The bigger difference between the two formats is in the audio quality. The low resolution audio is terrible: it sounds flat and tinny. It's compressed so badly that quieter sounds are sometimes lost entirely. The high resolution sound is good, if not great. The real story, though, is the low resolution sound, which is unlistenable.

The problem here is that the only way to get the high quality sound is with the high quality video. If your hardware, particularly your processor, isn't up to the task then you're stuck choosing among a smooth, full screen 360p video with atrocious sound; a choppy, full screen 480p video with great sound; or a smooth, windowed 480p video with great sound. I'll return to this, but needless to say it's not the best list of options to choose from.

Moving on to the interface, Hulu's site is fairly easy to use. I don't find their search particularly useful but most of the categories are organized well enough that it doesn't matter. The default sort of each category is popularity of video. You can also sort by air date, date added, and user rating. The same sorting options exist for shows instead of videos. I prefer to view by show, as I'm normally seeking a particular series or episode of a series. Each show has a page that lists all of the videos, with indications of when they were added and whether it's a clip or an entire episode. For my purposes most clips are rubbish, though The Office often has good clips of original "webisode" content.

The player interface is the best among the free sites. This is especially true if you make an account, where you can set your preferences to default to the higher quality video. If you don't have an account or change this setting then you will always have to click the 480p button to get to the higher quality stream. You cannot make this change while a commercial is playing.

When the player is not in full screen mode there are a few other interesting buttons on the right side of the video. If you're happy to watch the video in place you can click the "lower lights" button that overlays a translucent black layer on top of the page, but does nothing to dim the rest of your screen. If you can't watch in full screen but you want to resize the video you can click "pop out" which will put a very similar interface on screen in a window with no other content in it. Lastly, there's the full screen button.

The full screen mode is decent, but conspicuously missing is the quality setting. If you enter full screen only to realize that you forgot to go to the higher quality video you must exit full screen to change quality. The same is true if you enter full screen but find the video to be choppy and want to watch at 360p. I'm not sure why this is so but it proves to be an annoyance. You can stop and start the video with the space bar, and escape will exit full screen. I don't know if there are any other keyboard controls, but it's better than nothing.

Playback is fairly simple and intuitive. There is a progress bar at the bottom of the video that disappears after a few seconds, except in the pop out window where a small progress bar is shown the entire time. It has dots on it that show when the commercials are. You can skip ahead or go back. There's some algorithm that tries to force a commercial if you skip past one, and you only have to sit through one even if you skip past two. In the lower left is a play/pause button. That's it.
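The skip behavior described above might look something like this sketch. Hulu's real logic is unpublished, so the function name and the timestamps are purely hypothetical; the point is just that jumping over any number of unwatched breaks costs at most one ad:

```python
def ads_owed_after_seek(ad_breaks, seen, position, target):
    """Return the ad breaks to play after seeking forward.

    ad_breaks: sorted timestamps (in seconds) where breaks sit.
    seen: set of break timestamps the viewer has already watched.
    Skipping past any number of unseen breaks costs at most one ad.
    """
    skipped = [b for b in ad_breaks
               if position < b <= target and b not in seen]
    return skipped[-1:]  # at most one break: the last one jumped over
```

For example, with breaks at 5, 15, and 25 minutes, seeking from the start to the 20-minute mark skips two unseen breaks but only the second one plays; seeking the same distance after both breaks were watched plays nothing.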

Of course, this is an ad supported service. With a few exceptions every video you watch will have commercial breaks. You can pause the commercials but you cannot fast forward. Even if the commercial doesn't load it will make you sit through a 15 second notice that you're being a bad citizen. Occasionally you will be offered an alternative commercial scheme where you can watch a two minute ad and then the entire video will be commercial free. Normally I take that offer, especially if it's the cool Honda ad. I'm not a Honda guy, nor a big Danica Patrick fan, but that ad is good. In general I find the ads on Hulu to be far more tolerable than the ones on television. The breaks are shorter and the mixing isn't so ridiculously loud. They're also often real ads, not the self-serving drivel like on ABC.com. Lastly, the ads are played in line with the rest of the stream, so you don't have to click to continue. Overall it's a very television-like experience, but more pleasant because the ads are fewer and higher quality.

That brings me back to the original question: Is Hulu the best? My verdict is no. Hulu's service is limited to only a few devices and their media center capabilities are wanting. The well-documented fight they've had with Boxee hasn't helped. On my Macbook running a 480p video at full screen pushes the 2GHz Core 2 Duo processor to its limits, and at that it drops frames. Flash video doesn't seem to offload much if any of the rendering to the GPU, keeping it all on the processor. This is unacceptable when you pair it with the poor audio in the lower quality stream. The commercials are tolerable, if that were the only fault I might declare Hulu the winner on the strength of their catalog.

There is a respite for Hulu, though. They recently released a media-center friendly desktop application. The content navigation in this app mirrors the website -- though it's a little clunky, especially so with a remote. The playback is better, adding some fast forwarding capabilities and showing a scene preview if you use the progress bar to skip around. The scene preview is a little slow. The real killer feature is the "medium" video quality setting. It seems to play a little smoother than the 480p stream from in-browser and the sound quality seems better than 360p.

The short take is that Hulu is a great DVR alternative with a good selection, but they take second place in the online streaming contest.

Monday, April 20, 2009

New Rules for Netflix Ratings

I am a big fan of Netflix. I put some thought into how I use the service in order to get the most for my money. I'm fairly happy with the results I get, but sometimes I have to tweak my usage to serve myself better. How I rate movies helps me remember how I felt about a movie and it helps the system suggest more movies, or predict how much I'll like a movie. After a few years of rating movies one way I have decided to change.

The Old Way

My old system for rating a movie was to try to rate it as objectively as possible. I focused heavily on the merit of the movie: acting, script, and direction. I would then combine that with my preferences and come up with a rating. This introduced some personal bias, but I think most of the ratings were pretty fair. The exceptions were a few movies that I either loved greatly or hated completely, at which point I would typically let my emotions get the better of my objectivity and rate generally loved movies poorly or generally disliked movies highly.

The problem with this is that I was trying to be objective and not allow my bias to influence the ratings too greatly. This would be great if I were the only one reviewing these movies, or if the rating data wasn't being used for other purposes. Neither of those conditions is true, though. In short, I was being unfair to myself out of some sort of misguided attempt at journalistic integrity, even though I'm no journalist.

Other oddities happened because of this as well. I stopped trusting my own ratings. When someone asks me what I thought of a movie I will look that movie up on Netflix and use that rating to stir up the long term memories associated with that movie. It works great because I have the movie box, description, and my rating all there on one screen. I found that, increasingly as of late, I was having to mentally adjust my ratings based on whether I thought they were skewed for objectivity when I made them.

The New Rules
With my new system I will not change my baseline ratings. Instead, I will allow my bias to more significantly influence my ratings. After doing this and arriving at a final number, I will review it to make sure it accurately reflects how I interpret and feel about the movie. Then I'll click the little star that matches.

Basically, everything starts out the same as above. I get a rating number by thinking about how well made the movie was and whether it's worth watching. Then I allow myself to modify that rating by zero or more stars depending on how I felt about the movie and how strongly I felt it. If I have no strong emotions either way then a three star movie will remain at that rating. If I enjoyed that movie a good bit, I will probably add a star. I may add two stars in some circumstances. I doubt I would ever feel the need to add three. The opposite is true if I genuinely disliked a movie.

A few examples:
I recently rented The Prestige. It was a decent movie that mixes science, fake magic, and real magic. I thought it was beautifully shot and decently acted. The script was okay. Objectively, I think I would give it four out of five stars. Once I added more of my personal bias I reduced it to three stars, because I didn't like some of the treatment it gives to science, it was a little over-the-top, and it has a fairly obvious plot twist that seems to be there only for plot-twist addicts.

I also recently saw the import So Close. This is something like a Charlie's Angels flick set in Hong Kong starring the locals. It reeks of bad acting, it's completely over the top, and it's cheesy as anything. The action scenes are top notch, though. If you enjoyed the Charlie's Angels series and like Jackie Chan movies then you may enjoy this. I objectively gave it two stars out of five. I think in the grand scheme of things that movies like this are largely trash. They are, however, trash I tend to enjoy. I liked the car chases and the Asian culture infused in this. So I bumped the rating up to three stars.

As you can see, two movies on the opposite ends of the quality spectrum now have the same rating. I'm able to be both intellectually and emotionally honest.
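The new rules boil down to a baseline plus a bounded emotional adjustment, clamped to the five-star scale. Here's a tiny sketch of that arithmetic (a hypothetical helper for illustration, not anything Netflix provides):

```python
def final_rating(objective, bias):
    """Combine an objective 1-5 star rating with an emotional
    adjustment of roughly -2 to +2 stars, clamped to the 1-5 scale."""
    if not 1 <= objective <= 5:
        raise ValueError("objective rating must be between 1 and 5")
    return max(1, min(5, objective + bias))
```

Run against the two examples above: The Prestige is `final_rating(4, -1)` and So Close is `final_rating(2, 1)`, and both land on three stars.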

Other Rules
I did pluralize the word 'rule' for a reason. I have changed the way I think of a few things related to rating movies. I will no longer rate movies 'Not Interested' unless I have a very good reason. I have re-assessed my category ratings using the new Taste Preferences with a particular focus on emotional honesty.

For 'Not Interested', right now I'm reserving it for series items where I've seen parts of the series, but not all, and I am completely uninterested in watching any more. This means there are only 3 items with this rating so far: Dragon Ball Z, Home Movies, and Survivor Season 1. The first two are cartoons that I don't like, yet they are suggested because I apparently differ from the normal person who watches anime and adult oriented cartoons. The last is just weird. I don't know if the system suggested this for me or not, but I'll leave it there so that it won't suggest any "reality" shows.

I didn't like the effect that too many 'Not Interested' selections had on my suggestions and other ratings. I also don't like that it inflated some of my ratings counts. I've seen enough movies without the ones I haven't seen being counted.

My category ratings were a mixed bag of intellectual ratings, emotional ratings, and shame. Some categories I rated higher not because I like watching those movies, but because the movies themselves tend to be well made. That's great, until you realize that you aren't interested 15 minutes in but watch the whole thing anyway. The emotional ratings are probably the right ones, at least that's my take. Some of the ratings were born of shame, though. I was ashamed that I like anime, seeing that as the last step into hopeless geekdom. Finally, I realized that these ratings were entirely for me to help Netflix know what kind of movies I might enjoy. I'll eventually betray the same information by what I rent and how I rate it, so I should be honest to myself and rate categories as I think I would actually want to watch the movies in them. The good thing about Taste Preferences is that it presents the data in a way that makes this easier to swallow by asking you how often you want to watch such movies instead of forcing you to rate them on a five star scale. My only gripe is that I wish the ratings were more granular instead of never, sometimes, often.

That's a lot of thought put into rating movies. The good news is that I don't mull over these decisions for long each time I rent; rating a movie takes a second or two. I'm just trying to maximize my results.

Saturday, February 21, 2009

My RSS Reader has a "Productivity" Category

I put it there. Nevertheless, it's there. This post is about the irony of that.

Recently, one of the feeds that I have categorized under the Productivity folder, Lifehacker, posted about ManicTime. ManicTime is an application that somewhat unobtrusively monitors your application usage and provides a report. You can tag time spans and use it to help figure out how you're using your time. It's great for someone like me who has to fill out a detailed time sheet at the end of the week.

Here's where the irony comes in. I installed this app on Tuesday. Since then, the first day has been the only day on which Firefox was not the top application by sheer volume of usage. I used Firefox for almost 30% of the time I was on my computer. The most time-consuming thing I do in Firefox? Read articles fed to me via my RSS reader.

Fortunately I don't live in a fantasy land where average people are 100% attentive. If I did then I would think that something was seriously wrong with me. However, I do live in the real world and I think that 30% might be a little high. Sure, I do read trade-specific articles part of the time. I do have a business reason to have Firefox. Still, the primary reasons I go there are personal.

Now I have to figure out what to do, or if I should do anything. I'm not too worried about my productivity. I even amaze myself with my ability to meet or beat deadlines occasionally. I do wonder if lowering the noise might boost my productivity, at least in a way that would result in less overtime and more family time. Then again, if I take away my distractions during the day I may realize how boring and tedious my job is. I may stagnate and stifle my creativity. What to do? What to do?


I'm going to try to cut back. I think I need to push myself to improve this ratio. I at least owe it to myself to experiment and see if an extra 10% of my attention is worth the price.

Thursday, January 8, 2009

Canceling Cable

The special package price on my cable service expired in November. Suddenly I'm faced with a $165 per month cable bill. That includes Internet and phone service, but it's still a big bill for entertainment and communication. Especially when you consider that I'm also paying around $100 for a few cell phones and $15 for Netflix.

My first order of business is to cancel the phone service they offer. After the introductory period it now costs $35 per month, and I take very few inbound calls using that phone. I'm going to go with BroadVoice's In State Unlimited plan, which is ~$15 per month after fees and taxes.

Next up is the television service. This one is interesting. I like TV, and I think it is possible to consume it without becoming a zombie. It's just not worth the $100 per month that I'm paying for basic ($55 or so), digital/HD ($10), and two DVRs ($35). Don't get me wrong, I think I have a great setup and as far as cable goes I think Cablevision does pretty well. It's just that I don't want to pay $1200 a year to watch it, not anymore at least.

As for the Internet service, I will keep it. It's $45 per month if you have their cable service, and it will be $50 per month when I get rid of it. Cablevision's Internet service is very good, and FiOS isn't available in my area so there is no competition for it here. I'll need it for the VOIP service anyway.

In case you've been keeping a tally, $35+$100+$45 != $165. I know. I get a whopping $15 discount for having all three services. Good thing, because $180 per month would be intolerable. Glad I don't have HBO.
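For anyone who wants to check my tally, here's the math as a quick back-of-the-envelope sketch (the figures are the approximate ones from this post, not an exact billing statement):

```python
# Current bundle: phone + TV + internet, minus the triple-play discount.
phone, tv, internet_bundled = 35, 100, 45
bundle_discount = 15
current_bill = phone + tv + internet_bundled - bundle_discount
print(current_bill)  # 165 -- the sum is really 180 before the discount

# After the switch: VOIP (~$15) plus standalone internet ($50).
voip, internet_alone = 15, 50
new_bill = voip + internet_alone
print(current_bill - new_bill)  # 100 per month saved
```

That savings figure is where the "at least $100 per month" estimate later in this post comes from.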

Next up, I have to figure out what to do instead. I don't want to miss out on sports and I do enjoy a few sitcoms. My son watches a few cartoons, most of which we record on the DVR so he can watch them according to his schedule. My wife occasionally watches the news and the weather channel. Otherwise, our television viewership consists of random stints, almost entirely by me. I still want to have access to some of that, and I've come up with a bit of a plan.

I can sum it up in three words: Antenna, Netflix, Hulu. There are other aspects to this but those are my main weapons. Other factors include purchased DVDs, video games, and probably turning the TV off.

Antenna - I have to purchase an antenna for my HDTV. I've yet to do this, and it may be the most crucial element here. If the antenna doesn't work well then I will miss out on sports, something I really enjoy watching. Even with the antenna I will be forgoing ESPN; I'd rather not lose everything else as well. The antenna will also provide access to local news and primetime television.

Of course, I haven't purchased an antenna yet. I don't know how well it will work. I'll have to mess with it and I may have to make multiple purchases before I'm satisfied. That's fine, because I live in a major television market so I'm confident I will be able to find something that works.

Netflix - I already have Netflix, and most of the movies I watch are Netflix rentals. I'm a fan of the service. Right now I have the two-out unlimited plan. Under my new strategy I will return to at least the three-out unlimited plan, but I may increase that. I plan to take advantage of their streaming content, a library that seems to grow constantly. It will be a great way to spend a boring night if I don't have a movie at home or I'm not in the mood to watch any that I have. This leads me to the third part of my plan...

Hulu - Have you tried this service yet? They offer DVD quality streaming content and they have a wide variety ranging from obscure crap to some of the most popular shows on TV. Hulu has commercials, but they seem to show fewer than TV and so far they aren't mixed so loudly that they make you pee your pants when they come on. Obviously this isn't the only service to offer streaming video, but right now they're the best option to directly replace television viewing. Veoh, Netflix, and in some cases the show's own website offer similar streaming content. Another option is iTunes, where I may be able to purchase episodes of shows. So far there are no current episodes of any show I like available there, but who knows what the future might bring.

There you have it, my three pronged attack at the rising cable bill. I anticipate that this will save me at least $100 per month. I will save part of that money toward an entertainment budget that I will use to buy electronics and content. I hope to buy or build a media center PC sometime later this year so that I can record off air content and watch digital content on my TV.

Will this be as convenient as cable? Of course not. There is a reason cable costs money and people pay: it's easier than doing something like this. I won't get as much content, either. With my current package I get around 300 channels. Many of them broadcast 24 hours a day. Having a DVR attached to this opens the door to countless hours of content, more than I can consume. It's like the Golden Corral of entertainment.

Instead, I will use my entertainment budget to target my viewership. I will focus on things that I am more likely to enjoy, instead of whatever the networks decide to put in my viewing window. It will take more work to find these things, and sometimes I will have to pay for them, but I will be saving far too much to mind. Besides, I'm sure a little less consumption is a good thing.

Friday, December 12, 2008

Windows Vista: The Multilingual Must Pay

I have a Colombian friend for whom I occasionally perform some basic computer maintenance. She, and her whole family, are relatively dumbfounded by some of the chores required to keep their systems working in a usable fashion. They rely on me perhaps more than I would like, but they're good people and these are the things friends do.

Last week her mother bought a laptop. As I write this she and her new computer are flying to Ecuador. Since I was unavailable last week, I had to do some last minute work yesterday. They needed anti-virus software, the kind that doesn't require ridiculous yearly fees, and to have the language switched to Spanish. The first issue was a five minute ordeal to download and install Avast!, my current pick for AV. The second issue took two hours and required a hack.

You read that right: you cannot switch an English version of Vista Home edition to another language without using a hack. At first I thought it was annoyingly difficult, but when I found out that it was impossible without paying to upgrade to Ultimate edition I was floored. I'm not the only one; check out the anger and confusion at Microsoft's TechNet Forums over this issue.

The solution, as I mentioned, is a hack. Vistalizator, though it has a ridiculous name, was able to change the language in a few minutes. After that, I could barely work with the context menus. Since much of the software was developed in English it turned the laptop into a Spanglish mess. Something tells me that is a perfect result.

Friday, July 25, 2008

Finally Making a Move from Microsoft

Years ago I attempted to shift my home from Windows to Linux (GNU/Linux with KDE, GNOME, and a few other desktop environments, for the sticklers out there). It was a terrific success and then a huge failure. For around six months Windows was almost never used at home. Then my wife started school, and a week into that it wasn't worth the struggle to get things to work with her school.

Her school was generally uncaring about what she did or didn't have available to her; as far as they were concerned, she had to have a Windows OS and Microsoft Office. They wouldn't accept PDFs, her assignments involved creating documents that use features RTF doesn't support, and at the time OpenOffice.org still had issues saving Word documents. So, even though I managed to trick the school's website into unblocking Konqueror, she eventually had to use Windows.

Windows is a drug. All it takes is one use and you're hooked. Ignore the side effects, the constantly degrading performance, the nearly mandatory reinstall every few years, the need for all sorts of protective [and resource draining] software, it's still easier than fighting the tide. Roll with it, and you become addicted to the ease of communicating with others who are hopelessly addicted to Microsoft's proprietary formats. You can move on with your life and forget about the computer.

Along came Vista, and things started to change. The staunchest Microsoft supporters can only give a meek yelp of defense for Vista. It is becoming a Windows Me 2, and everyone is avoiding it. As more people move to that OS I hear more chatter about this or that device becoming a useless brick. Now that you must jump through hoops to buy a new system with XP, more people are becoming sympathetic with those who switch from Microsoft.

The resurgence of Apple is the tipping point. Apple is becoming less of a niche market every day. It helps that Apple kept their price points up so Macintosh owners are now associated with money. America respects money, so America has started to respect Apple again. With respect for Apple comes the understanding that interoperability [the real definition, not Microsoft's distortion] is important.

To give due respect, Firefox plays a big part in this as well. Firefox is largely responsible for the browser market opening up again. Now that Internet Explorer is relegated to a mere two thirds of the market, fewer businesses are ignoring the other browsers. This means there is far less of a chance that a website will require IE for basic use. I still encounter a lot of small organizations that are behind the times and want IE for important features to work, sometimes inexcusable features like rendering or navigation, but it is far better than it was. That matters, because the World Wide Web becomes more useful every day.

When a shock to my Inspiron notebook bricked it this weekend I had a choice. I could continue with my addiction to Microsoft, or I could pay a little more and make a big step toward freedom. I took the step and I bought a MacBook.