HOWTO: Convert Video For Use on Your BlackBerry Bold

I’ve been toying around with my BlackBerry Bold 9000 a bit more since I’ve become more and more reliant upon it for my day-to-day duties (GTD, email, Reuters News, etc.). I migrated my life away from my aging Palm Treo 680 several months ago, and I can’t imagine ever going back.

There are a few things I miss (DayNotez from Natara being the biggest one), but the raw capability of the Bold outweighs that small gap in function. The TRUE multi-threaded support is outstanding: taking a call while synchronizing, all while streaming music in the background, on the same device.

The more I find myself using my Bold, the more I find myself wanting to use it more… so I started thinking about how I could start putting some full-length DVD movies on the 16GB microSD card so I could while away the time while I work. The audio and video output are absolutely amazing, once you get the conversion right.

At first, I was looking for a standard Windows-style app to convert the video from my physical DVD to a format suitable for playing on the Bold. Bzzt! I tried several and the best I found was a project called Videora. Ultimately I found it to be clunky and amateurish. It also takes several hours to convert each video; too slow for my needs.

Basically nothing commercial I found out there for Windows was up to the task. None of the tools got it right.

So then I went to the old standby: free tools and Linux. I’ve used mencoder and ffmpeg before to convert YouTube video for the iPod, so this should have been very similar.

It wasn’t.

Fishing around, I stumbled upon this useful page of “19 ffmpeg commands for all needs” by Jean-Baptiste Jung. It goes through quite a bit of detail with commands for converting video and audio to all sorts of formats and devices. It is a handy reference, but still lacks the power and flexibility I need.
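To give a flavor of what these commands look like, here’s a small sketch that builds an ffmpeg command line targeting the Bold’s 480×320 screen. The codec and bitrate choices below are my own guesses for illustration, not values taken from that page:

```python
def bold_ffmpeg_cmd(src: str, dst: str) -> list[str]:
    """Build an ffmpeg command line for the BlackBerry Bold 9000.

    The Bold's screen is 480x320; H.264 video plus AAC audio in an MP4
    container plays in the built-in media player. The bitrates here are
    assumptions -- tune them for quality vs. file size.
    """
    return [
        "ffmpeg", "-i", src,
        "-vcodec", "libx264", "-s", "480x320", "-b:v", "512k",
        "-acodec", "aac", "-ar", "44100", "-b:a", "128k",
        dst,
    ]
```

Pass the resulting list to `subprocess.run()` to do the actual conversion.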

Not satisfied with that approach, I decided to keep looking, and I finally found “EncodeHD”, a Windows tool based on ffmpeg and other OSS components under the hood. As I type this, I’m converting several gigabytes of data to a format suitable for my BlackBerry Bold, and will give it a test shortly.

[…time passes…]

Here are some screenshots of the results! (Taken with BBSAK, which I’ll write up in a follow-up post. BBSAK is an amazing tool, much, MUCH better than bbscreenshooter.)

Kung-fu Panda running on my BlackBerry Bold

Several years ago, I converted the entire full-length, extended DVD collection of Lord of the Rings to my iPod using ffmpeg, so I could watch it during the 17+ hour flight from CT to Australia.

Ah, those were good times.

Testing the Speed of BlackBerry Tethering Against My Own Networks

I’ve been a long-time Cingular customer with my phones, and when they converted to AT&T, everything got wildly complex.

My normal phone bill hovers between $225.00 and $250.00 each month (yes, really… see below):

November 2009 AT&T Statement

This bill consists of my handheld (BlackBerry Bold, $99/month unlimited data + voice + text) + SIM card (inside my laptop, $59/month unlimited data). Since this is effectively two SIM cards, it counts as two separate “phones”.

Oddly though, the one inside the laptop still gets the 9-1-1 surcharge, even though there’s no way I could “dial 9-1-1” from the laptop. If I’m in any sort of emergency situation, the last thing I’m going to do is fire up the laptop, connect to the LAN, launch Skype and call 9-1-1 from there. But they still charge me $0.35/month for that “privilege”.

I use the laptop while traveling on the train to the office, but when I suspend the laptop and resume it, the Linux “sierra” driver does not wake the SIM card back up. There is no known fix, and I tend to have to close out all of my apps, suspend my VMware sessions, power off and reboot to wake the SIM card back up. Not fun.

I wanted to try to reduce the bill, and spent about 2 hours on the phone today with a lovely woman named “Sue” from AT&T to discuss my possible options. There are a few, but all have downsides (reduced cost but fewer minutes, or more minutes but losing my 20% company discount, and so on).

So I’ve been testing tethering my BlackBerry to my Linux laptop, using any number of tools (wvdial, XmBlackBerry, Berry4all).

This does work, if you configure it properly. I ran into lots of trouble with it originally, because /etc/ppp/options had some conflicting options that my hand-written, optimized “blackberry” chatscript didn’t work well with. Once I figured that out, it latched right up immediately.
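For anyone attempting the same setup, a wvdial dialer section for this looks roughly like the following. This is a sketch, not my exact configuration: the device path is whatever pseudo-tty your tethering tool exposes, and the APN and credentials are the commonly used AT&T GPRS values of the era, so verify them against your own plan before dialing:

```
[Dialer blackberry]
Modem = /dev/ttyUSB0
Baud = 460800
Init1 = ATZ
Init2 = AT+CGDCONT=1,"IP","wap.cingular"
Phone = *99#
Username = WAP@CINGULARGPRS.COM
Password = CINGULAR1
Stupid Mode = 1
```

The key lesson from my debugging: make sure nothing in /etc/ppp/options conflicts with the options wvdial passes to pppd.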

Writing data size: 4
	Modem -> [0x41 0x54 0x48 0xd ] [ATH.]
Waiting for PPPD shutdown to complete.
Hangup (SIGHUP)
Connect time 3.2 minutes.
Sent 200888 bytes, received 1852094 bytes.
Script /etc/ppp/ip-down started (pid 9381)
sent [LCP TermReq id=0x3 "User request"]
Script /etc/ppp/ip-down finished (pid 9381), status = 0x0
Network stats thread completed.
sent [LCP TermReq id=0x4 "User request"]
Connection terminated.
Modem hangup
PPPD finished

At this point, I could use my BlackBerry as a modem for my laptop and get around the suspend/resume bug with the Linux Sierra driver, but that comes at a price (literally and figuratively).

My laptop’s “phone contract” doesn’t expire until March 2010, and I pay about $53/month for that, and the early termination fee is $175.00. I could cancel that now, and save $90.00, but then I’d have to pay $60.00 for the “unlimited data + tethering” package. I already have a $30 “unlimited data” package on my BlackBerry, and so that would be a net add of $30.00 to the existing $99.00 plan already on there.

But what exactly is the “tethering package” that AT&T is offering doing for me? What am I paying $60/month for? I can tether today. It works. I can continue to consume the data on the data plan side of things, so why pay for tethering?

Basically I’d be saving $23.00 over the cost of a $225+ phone bill, after paying $175 to get out of the existing contract. You can see where this is going.. because AT&T has put some serious mathematicians behind figuring this out, so they can extract every single nickel and dime from your personal monies to pad their coffers.

I wanted to do some speed tests to see what the actual performance gain/loss would be across my local WiFi segment, my laptop’s onboard AT&T card (using the aforementioned “Sierra” driver), and the BlackBerry tethered to the laptop using ppp to “dial out” to the Internet.
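The underlying measurement is simply bytes over time. A minimal sketch of that idea (the URL would be any large, well-connected test file; the screenshots below use speedtest.net):

```python
import time
import urllib.request

def measure_kbps(url: str, limit: int = 1_000_000) -> float:
    """Download up to `limit` bytes from `url` and return the observed
    throughput in kilobits per second."""
    start = time.monotonic()
    with urllib.request.urlopen(url) as resp:
        data = resp.read(limit)
    # guard against timer resolution on very fast local reads
    elapsed = max(time.monotonic() - start, 1e-9)
    return (len(data) * 8 / 1000.0) / elapsed
```

Run it once per link (WiFi, tethered BlackBerry, onboard SIM) against the same URL for comparable numbers.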

Here are the results:

Speed using my home WiFi Connection:


Fast, strong, solid speeds. No complaints, and I use it every day to push gigabytes of data around.

Speedtest using my home WiFi connection

Speed using my BlackBerry tethered to my laptop:


As you can see, the speed is disgustingly slow here. Absolutely useless for anything more than telnet or ssh, and barely even good enough for that.

Speedtest with the BlackBerry tethered to the laptop

Speed using the onboard AT&T SIM card inside my laptop:


The speed here is reasonable, and acceptable for working on the train.

Speedtest using AT&T GPRS on the laptop

I just can’t continue to stomach the costs of the whole set of services anymore. $200+/month for a standard phone bill with no overage charges is ridiculous.

From what “Sue” at AT&T told me, everyone who uses a BlackBerry or an iPhone with an unlimited data plan pays roughly the same amount. I’m skeptical.

I’m going to play with the tethering/ppp options to see if I can’t get some more performance out of it, and roll through to March and discontinue the service that my laptop is currently consuming.

The real question, though, is how can AT&T tell whether I’m just using my BlackBerry to stream a ton of data (like through Pandora for BlackBerry), or tethering?

Why “Cloud Sync” Will Never Work

There’s been a lot of talk lately about “the cloud”. We’ve done the cloud before. First we called it “clusters”. Then we called it “grid”. Now we call it “the cloud”. I’m not sure what term marketing will need to call it in a year or two, but one thing is for sure: “cloud sync” is doomed to fail, before it even gets started.

“Cloud sync” is the term used to describe sending your data from various end-user devices (“clients”, usually handheld devices or desktop PIM apps) to the cloud; usually a central server somewhere. Some popular examples of this are:

  • Funambol
  • ScheduleWorld
  • Google Sync

…and literally dozens of others.

Of these, Funambol, ScheduleWorld and Google Sync are among the most mature, and also among the most problematic (they are based on SyncML).

And every single one of them is failing, because they’re all based on some very broken logic and monolithic designs:

  1. Not a single one of them supports synchronizing multiple calendars or other data sources while keeping them separate in the cloud’s datastore itself. Without the ability to sync multiple data sources under the same user, the cloud becomes completely useless.
  2. Not a single one of them understands how to get sync right, without corrupting, duplicating or deleting data. Every single one of them has an issue here.
  3. None of them are prepared for handling a device they’ve never seen before, or a device which has records that include a field or format they don’t already know about. Everything has to be pre-determined, pre-defined, and that will never scale.

I’ll give you one example, one I’m fighting with every single day. I’ve run this same (very structured) scenario through about two-dozen vendor, free and commercial projects and products, and every single one of them breaks down and fails in one way or another.

  1. Back up your device, making sure you have a complete copy of the data, in case anything goes wrong (and it will).
  2. Sync your client device to its native software. In the case of Palm, you’d sync that to Synergy. In the case of Blackberry, you’d sync that to BlackBerry Desktop Manager, and so on.
  3. Sync that same device to your favorite PIM package. If you’re using an iPhone, sync that to iCal or Microsoft Outlook. If you’re using Linux, sync your device to Kontact or Evolution; if you’re using a BlackBerry, sync it to Lotus Notes or Microsoft Outlook.
  4. Sync this same device to “the cloud” using whatever software is provided by the cloud project you’re synchronizing with. If you’re using ScheduleWorld, use their sync software. If you’re using another, use whatever software they provide.
  5. Now install the sync tool for your target “cloud” service in your PIM application. If you’re using Microsoft Outlook for example and Funambol, you’d install the “Funambol Outlook Plugin”. Likewise for whatever other PIM and service you choose.

At this point, your client device (your iPhone, BlackBerry, Palm or other handheld) and your PIM device (Microsoft Outlook, Lotus Notes, Evolution, Kontact, etc.) should all contain exactly the same data, and that same data should now be available “in the cloud”.

Now sync your handheld device to the cloud again, and watch what happens. In every single case, without fail, you will get a.) corruption, b.) lost data, or c.) duplicated data.

There have been numerous discussions on dozens of mailing lists about how to get sync to work correctly. I can say, from over a decade of in-depth personal experience with synchronization, that it is downright impossible to get this right without very aggressive, deep inspection of every single field of every single record at each sync, once you go beyond a 1:1 relationship. The “cloud” concept already implies a many:1 or many:many relationship.

Let me replay it to show what I mean:

  1. Sync your handheld device to your PIM (usually using a USB connection). Right now, your handheld and your PIM should contain exactly the same data.
  2. Sync handheld device to the cloud. This creates new records in the cloud which should match your PIM exactly. Now PIM, device and cloud should contain the same data.
  3. Sync your PIM to the cloud. Depending on the software used, you’ll trash data here. In the case of something like Funambol, you’ll take multiple data sources and merge them into one source in the PIM, creating a mess and lots of duplicates. Other software packages do similar things to the data.
  4. Sync your handheld device to the cloud again, which will now duplicate and transport that mangled data to the handheld.

There is absolutely no way for the cloud to know whether your handheld has the same data the cloud or PIM has, so it must inspect every record at sync time. None of them do this.

You’ll probably hear the terms “slow sync” and “fast sync” in reference to these issues, and I can say with certainty that nobody is doing it 100% correctly.

“Fast sync” is a term used to describe synchronizing one device with one server. Pointers to the last changes on either end are captured, so when you sync again, it only has to send the updates and changes across. Palm handheld devices use a “LastSyncID” on the device to identify the client to the server-side. If the server sees a different incoming client connection, it knows that it isn’t the device it last talked to, so it initiates a “slow sync”.

“Slow sync” is a term used to describe a sync that happens when the server and the client device can’t determine if they’ve “spoken” before, so they send the entire contents of each datastore (calendar, contacts, tasks, memos, etc.) and then compare on a record-by-record basis, to see if there are any changes.

Once you sync more than one device to a server, there is no way you can maintain a “fast sync” relationship with each client.

A == handheld (iPhone, BlackBerry or other)
B == PIM application (Microsoft Outlook, Lotus Notes, Sunbird)
C == "cloud" server

A -> C
B -> C

There is no way for C to know that ‘B’ contains the same data as ‘A’, so it must inspect every record as it arrives, to consolidate the changes, if any are found.
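This can be made concrete with a toy model (all names here are mine, for illustration). The server keeps one shared datastore plus a per-client sync anchor; a client gets a cheap “fast” sync only if nothing else has touched the server since its last visit, which stops being true the moment a second client shows up:

```python
class CloudServer:
    """Toy model of a sync server with one shared datastore."""

    def __init__(self):
        self.records = {}   # record id -> value
        self.version = 0    # bumped on every change
        self.anchors = {}   # client id -> server version at its last sync

    def sync(self, client_id, client_records):
        """Return 'fast' if the client's anchor matches the current server
        version (only deltas need sending), else 'slow' (every record must
        be compared, field by field)."""
        mode = "fast" if self.anchors.get(client_id) == self.version else "slow"
        # merge: last writer wins, exactly the naive logic that mangles data
        self.records.update(client_records)
        self.version += 1
        self.anchors[client_id] = self.version
        return mode
```

Sync A, then A again, then B, then A: the final A sync falls back to “slow”, because B’s visit moved the server’s version out from under A’s anchor.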

The closest I’ve come to a “perfect solution” is using Intellisync’s backend behind BlackBerry Desktop Manager, but it requires a local (USB or Bluetooth) connection, and requires Microsoft Windows. This is an unacceptable solution to me.

When you sync your BlackBerry to BDM, it does a minimum of 4 separate passes across the data, to compare, inspect and merge any and all changes that may be necessary. If anything has changed or is in conflict, a dialog will be displayed that gives you the ability to re-sync, cancel, accept, etc. the changes, as well as inspect them on a change-by-change, field-by-field basis. It does this every single time. There is no “fast sync” capability with BlackBerry Desktop Manager. It does not trust the incoming data by default, nor should it.

But using BDM, there is no way to accept some changes, reject others, and merge yet others. It’s an all-or-nothing solution, and when you have 4,700+ Calendar events, as I do, and a sync process that takes over 20 minutes per pass, it can be quite painful.

Until we develop a way to accept anything the client sends, store it on the server in a unique, per-device datastore, and then aggregate that data in an intelligent way back down to the other requesting clients, “cloud sync” is doomed to fail, before it even gets off the ground.

The current thinking is that we just push the data up to the server(s) (“the cloud”), and then pull it back down to other client or end-devices. That logic is broken, because it isn’t that simple.

If I have a device which supports photos in a Contact record, and I sync to my PIM which does not support photos in the Contact records, and then sync my device to the cloud, and my PIM to the cloud… what happens to the photos? I’ll tell you… they’re lost. Once I sync that PIM to the cloud, the photos are gone, and then when I sync my handheld to the cloud, those (photo-less) records in the cloud are sent to the handheld, trashing all of my local contact photos.

This is just one example, but it proves the point that we can’t continue storing data “in the cloud” in a single, large datastore. Each device is unique, and each device’s way of representing the data is unique. Without keeping that data separate, you’re going to trash, duplicate or lose data.
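The photo example above can be sketched directly. A single merged store keeps only the fields every participant understands, while a per-device store preserves each device’s full representation (a toy model of mine, not any real product’s schema):

```python
def merged_store(device_record, pim_record):
    """Single shared datastore: only fields common to both representations
    survive a round trip, so the handheld's photo is silently dropped."""
    return {k: device_record[k] for k in device_record if k in pim_record}

def per_device_store(records_by_device):
    """Per-device datastore: each device's full record is kept verbatim;
    aggregation happens per field at read time instead of at write time."""
    return dict(records_by_device)  # nothing is thrown away
```

With a handheld contact that carries a photo and a PIM contact that doesn’t, the merged store loses the photo on the first pass; the per-device store still has it to send back.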

There are ways to fix this, and plenty of bright people working on these problems, but most of the projects I’ve encountered are so stuck in the dark ages that they refuse to think about solutions that are truly future-proof and allow any device, with any data representation, to cohabitate within the cloud alongside other devices and services.

SOLVED: Missing Microsoft Office 2007 Shortcut Icons

About 2 weeks ago, I noticed that my Windows application shortcut icons were showing the default “no type associated” icons for all of the Office-related documents. I could double-click one of these (such as a .pptx or .xls file) and it would open in the correct application, so the association itself was working… but the icons were the generic Windows icon:

Windows default shortcut icon

I found a very detailed page describing various ways to try to fix the issue, and I tried all of them. None of these worked.

What did work, however, was completely quirky and inexplicable. I had to replace a directory under C:\Windows\Installer called:

{90120000-0012-0000-0000-0000000FF1CE}

Here’s how I stumbled upon this. I tried to change the default icon for Excel files by doing the following:

  1. I opened an Explorer window (explorer.exe)
  2. I clicked on Tools -> Folder Options -> File Types
  3. I scrolled down to XLS in the list and clicked on “Advanced”. I saw the following dialog:
    Excel icon file type dialog
  4. I clicked on “Change Icon”, and the following error message came up:
    Windows installer xlsicons

So the icon shortcuts were missing, because this weird directory was missing (probably some disk-cleaning tool I ran purged that directory to regain some space).

I found that directory on one of my other Windows laptops, copied it over, and now the Office document shortcut icons are working again.
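If you’d rather script the copy than drag folders around, something like this works. This is a sketch: it assumes the healthy machine’s C:\Windows\Installer directory is reachable (e.g. over a network share) and that both machines have the same Office build; the GUID is the directory named above:

```python
import shutil
from pathlib import Path

# The Office installer icon directory from the post
GUID = "{90120000-0012-0000-0000-0000000FF1CE}"

def restore_icon_dir(source_installer_dir: str, dest_installer_dir: str) -> Path:
    """Copy the missing Installer icon directory from a healthy machine.
    Does nothing if the directory is already present."""
    src = Path(source_installer_dir) / GUID
    dst = Path(dest_installer_dir) / GUID
    if not dst.exists():
        shutil.copytree(src, dst)
    return dst
```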

Windows installer icon directory

Whew, that was a rabbit hunt.

Still Searching for Calendaring Nirvana

I’m still on a quest, a long and arduous one, to find the right series of tools and techniques that will let me get my Work (gnu-designs, inc. and $EMPL) and Personal calendars into a format I can use and manage on my handheld device (currently a BlackBerry Bold 9000), without losing appointments, data or context.

I tried using Google Calendar Sync, and found the following outstanding issues with it:

  1. No support for multiple calendar files. This is a HUGE oversight, and unfortunately, most vendor products lack this capability, but I don’t think I know anyone who doesn’t have a separate Work and Personal calendar of some sort.
  2. Duplicating events. That’s just bad programming logic.
  3. Ignoring/missing events. See above, just poor logic flow in the sync design.

After getting frustrated with that, and trying various methods to try to work around these deficiencies, I was led over to “SyncMyCal“, which is a pretty slick solution overall. Except there’s no way to keep the data separate once it gets to Google Calendar. If you want to sync the data back from Google Calendar to Microsoft Outlook, it just gets glommed into the main (default, as defined by your configuration) Calendar file on your desktop.

Yecch.

Ironically, the company with the absolute worst track record for managing PIM data (RIM, makers of the BlackBerry handheld devices) has a solution based upon Intellisync (now owned by Nokia).

Their sync solution supports multiple source calendars (Work + Personal), synchronizes them to a single handheld device, and keeps them separate, without intermingling them on the way back to the desktop when you sync again. This is the closest thing to calendaring nirvana that I’ve ever used. It isn’t without its own issues, however.

The major problems I’ve found so far are:

  1. Major, major data corruption about 75% of the time, when a sync hangs or fails outright. In order to get around this, you have to delete the Intellisync directory (C:\Documents and Settings\$USER\Application Data\Research In Motion\BlackBerry\Intellisync), and re-create your sync rules from scratch.
  2. The catch (and the second major issue)? You need to be connected to the live Internet in order to configure the sync of your BlackBerry to Outlook, using Intellisync through BlackBerry Desktop Manager. Why? WHY do I need to be connected to the Internet to create a local sync rule between Outlook and the BlackBerry? Nothing at all in that process touches the LAN or the Internet. Something fishy about that.

I found an interesting thread that suggested using Mozilla Sunbird as a “trampoline”, and going from Outlook -> Sunbird -> Google Calendar -> Google Sync for Blackberry.

In that thread, user “ssitter” mentions FreeMiCal, which is a project aimed at creating proper, standards-compliant iCalendar files from Outlook as a data source, and using those .ics files as the import source into Sunbird. From there, you can then use the Sunbird Provider for Google Calendar mentioned in this other thread to get the calendar data into Google Calendar, in a two-way fashion.

Or so I thought… Here’s a quick snip of what Outlook sees in my calendars (4,074 events total):

Outlook work + personal calendars

And here’s what FreeMiCal sees when I launch it (402 events out of 4,074 total):

FreeMiCal Outlook

In other words, it too doesn’t see my other calendar (the one with 3,672 events in it). Sigh.
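A quick way to sanity-check what an exporter actually produced is to count the VEVENT blocks in the .ics file it wrote, rather than trusting any GUI summary. A minimal sketch:

```python
def count_vevents(ics_path: str) -> int:
    """Count the VEVENT blocks in an iCalendar (.ics) file."""
    count = 0
    with open(ics_path, encoding="utf-8") as f:
        for line in f:
            if line.strip() == "BEGIN:VEVENT":
                count += 1
    return count
```

Running something like this against FreeMiCal’s output would have shown the 402-vs-4,074 shortfall immediately.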

So I’m back to square-one again, trying to find a solution that allows me to keep my calendars separate in Outlook (or Sunbird or Evolution or whatever desktop-side PIM client I choose), and aggregate them on my handheld device, but then keep them separate when they get synchronized back.

Here’s what I’ve used and excluded thus far, because nothing fits what I’m looking for:

  1. Blackberry Desktop Manager
  2. SyncMyCal
  3. CompanionLink for Google
  4. FreeMiCal
  5. Google Sync for Outlook
  6. Google Sync for Blackberry
  7. Funambol

Still nothing, still searching, and any suggestions or ideas anyone has… I’m all ears and eyes.

Finally done with Xobni, for the 9th time (and for good)

After extensive testing of Xobni’s Outlook search tool through several versions and iterations (including being on their beta testing team), I’m finally giving it up, for good. It simply does not work for an Outlook user at the level where I use Outlook (hundreds of emails a day, hundreds of thousands of emails in the archives). I’ve posted about my Xobni frustrations before.

Let me explain:

I use Outlook primarily in two places:

  1. Work desktop machine running Outlook 2007, connected through Exchange. This machine is tightly controlled, monitored and locked-down. I can’t reconfigure or install anything within this Outlook instance at all.
  2. Personal VMware session running Outlook 2007, connected through work VPN to Exchange and locally to a .pst file which contains my Personal Calendar, Contacts, Tasks, Notes, etc.

When I’m working from the NYC office, I use Exchange through Outlook, with no add-ins, extensions or anything other than a locked-down instance of Outlook 2007 for managing email, calendaring, tasks, notes and contacts.

When I’m working from the CT office (the other 90% of the time), I use Outlook through VMware Workstation, connected to the work Exchange server over the VPN to retrieve and compose/respond to my work-related emails. I also have my work Calendar, Tasks and Notes available in the same instance. I have my Personal Calendar (in its own .pst file) overlaid on top of my work Calendar, so I can see my appointments in aggregate for both personal events as well as work meetings.

So far, this is a pretty normal usage scenario for any user of these kinds of tools.

When I installed Xobni into Outlook, I made sure that the VPN was disconnected and that all mail in the cached .ost file was properly indexed, checked and verified to be intact and not damaged in any way. I then went into the Xobni settings and:

  1. …disabled all of the “phone home” features of the product
  2. …disabled the Facebook, LinkedIn and Skype integration. I don’t use those additional features of Xobni, and while connected to the VPN, they would simply fail anyway. Xobni doesn’t allow me to set a proxy server anywhere, and it can’t use the default one as configured in MSIE or the environment.
  3. …disabled any and all automatic indexing and background indexing

But when I log into the VPN and launch Outlook, Xobni will start up and then completely hang Outlook at random points. When Xobni hangs Outlook, I have to hard power-off the VM and reboot. I can’t close Outlook, and I can’t shut down the VM cleanly, because whatever Xobni does hangs the entire machine itself. I suspect (and this is just a theory) that Xobni is trying to reach the live Internet and blocks, hanging Xobni, which hangs Outlook, which hangs the underlying VPN, which hangs the network interface, which hangs Windows.

I’ve installed Outlook Shutdown for Outlook and KnockOut, in an attempt to get Outlook to close completely and fully when requested, but it still hangs. I can’t kill Outlook.EXE via Task Manager (Task Manager hangs too, and running any other Windows application also hangs, including Start -> Run -> CMD.EXE and opening Windows Explorer). It’s a huge mess when Xobni hangs everything up.

And what’s worse, is that when I have to hard-power off the Windows VM, it corrupts the open .pst and .ost files that Outlook uses, which necessitates that I run SCANPST.EXE and SCANOST.EXE on them to repair them.

Not fun.

But when I run Outlook with /noextensions /safe (other Outlook command-line switches can be found here), all works fine and dandy. When I uninstall Xobni using Total Uninstall (great tool, highly recommended), and run Outlook in “normal” mode, all works fine and dandy.

I reported my issues in great detail, as well as the steps to reproduce them, on the Xobni Community Support site, and received thousands of post views, but not a single response to any of them. I just visited their support forum, and noticed that it’s been revamped with a whole new interface, and a whole new hosting provider (“Get Satisfaction”, an ironic name considering Xobni does nothing of the sort for their users).

But they’ve purged and deleted all previous threads, discussions, topics, and users from the system. So now my posts are gone, as well as my account, account history and any threads or comments I’ve created or responded to. Nowhere on the Xobni blog do they mention revamping their community site nor do they mention any sort of migration path from the old community site to the new one.

Stupid, stupid, stupid. The last thing you want to do when germinating a community of enthusiastic users around your product is to purge them, boot them out and tell them to start over again from the beginning.

And so, I’m doing the same with Xobni now. Their product is buggy, hangs, crashes and corrupts Outlook and its data. Avoid it and look for other Outlook search solutions (X1 seems to be gaining in popularity these days).

Something else that I found while searching for Xobni alternatives, is Trog Bar, a very odd name for a very slick product. It’s a tool that docks on your desktop, and can ferret through all of your email, looking for “actionable” emails. It then uses something called “Task Sense” to determine the priority of those actionable items, and sorts them accordingly.

I’m still searching for a better way to search, sort, and analyze my emails in Outlook because there really is nothing else out there that works right.

Cleanly installing and running Adobe Air and TweetDeck on 64-bit Linux

A lot of people have been trying to figure this out without much success, and because I refuse to just give up and quit, I finally did.

The installation seems to work fine on 32-bit Linux, but does not work at all on 64-bit Linux.

Here’s how to get Adobe Air installed on your machine, and then from there, get the applications to be installable via Firefox and the CLI, and have Adobe Air update itself to current, as needed… all on 64-bit Linux (Ubuntu in my case).


The Myth about Monitor Refresh Rates and Fatigue

There seems to be some misconception around the issues related to eyestrain and fatigue from staring at computer screens for great lengths of time. I spend thousands of hours a year staring at computer screens at home, at work and elsewhere.

My office desk has 4 monitors set up in a panorama format, I have 3 laptops at home that I use, and 2 large 26″ LCD monitors in my home office, as well as my phone, PDA, PSP, and my daughter’s DVD player, Leapster 2 and other devices. I’m surrounded by screens and monitors all day long.

But these are not CRT monitors, and there’s a reason for that.

Back when CRT monitors were all there was, they refreshed at a rate of roughly 60Hz, which, coincidentally, was also the frequency of the standard fluorescent tubes used in almost every office environment. To reduce eyestrain, you could:

  1. Buy a monitor with a higher refresh rate (or set your monitor for a higher rate through software), or…
  2. Switch to a non-CRT monitor such as an LCD monitor or a projector, or…
  3. Change the fluorescent ballasts you’re using to lighting which does not happen to flicker at 60Hz, such as one of the newer solid-state ballasts (which operate at 25,000Hz instead of 60Hz) or full-spectrum bulbs, or…
  4. Just ignore the problem and hope it goes away, along with your eyes and headaches.

There are quite a lot of negative side-effects of using the standard 60Hz fluorescent ballasts, such as:

  • The ballasts operate at 60Hz, or cycles per second, the same frequency as the AC voltage they run on. This means that each lamp switches on and off 120 times per second, resulting in a barely perceptible flicker and a noticeable hum (sounding like a buzzing low ‘A’ note on a piano). About 25% of the population is sensitive to ballast flicker and hum and can actually become physically ill, with symptoms such as headaches, nausea, itching and burning eyes, tension, eye fatigue, and general fatigue.
  • Operating at 60Hz, they may cause a stroboscopic effect with any machinery which has parts, such as pulleys or gears, running at speeds that are a multiple of 60Hz. The stroboscopic effect will cause the machine to appear motionless, which could be a deadly hazard.
  • The most commonly used electro-magnetic ballast, the rapid-start type, draws 2-3 watts even if the lamp’s tubes have been removed, a practice often employed by businesses to reduce excessive light levels. This could be a sizable expense in a building with many lamps.
  • They give off excessive EMF (Electro-Magnetic Fields), considered a potential cancer-causing agent.
  • Any of the electro-magnetic ballasts produced prior to 1978 contain PCBs, a known carcinogen.
  • Not energy-efficient, with a relatively short life span of about 10 years.
  • During the final 30% of their lifespan they consume the same amount of energy, while producing far lower light levels.

I switched over to full-spectrum bulbs and CFL bulbs a few years back, so I don’t see these issues, but I also no longer use a CRT monitor.

Here’s the confusing bit: LCD monitors don’t have the same refresh-rate issues. People bring this up year after year, and it was raised on a mailing list I participate in, in the context of eyestrain when using ebook readers on mobile devices (Palm, Kindle, etc.). A lot of people on that list suggested that the person change the refresh rate on their LCD so it doesn’t cause eyestrain.

Let me restate: Changing refresh rates on LCD monitors does nothing to help or alleviate eyestrain. In fact, it does nothing at all.

The “flicker” you get when using a standard CRT monitor + fluorescent bulbs is a result of phosphor decay; that is, after the energy from the electron gun is transferred to the phosphor material, the energy and the resulting light begin to decay until the electron beam hits the phosphor again. …

From the IEEE Xplore page:

“…the decay-time constant of an image tube phosphor is a complex function of many variables: chemical, electrical, and mechanical. One variable often overlooked in image tube applications is that of excitation time. This letter presents the excitation-time/decay-time characteristics of some common phosphors and their application to some display problems.”

Since the standard LCD monitors we use today do not employ phosphors in their construction at all, the issue of “refresh rate” is completely irrelevant. The transistors used in the LCD remain open or closed as needed until the image changes.

This tends to be confusing because most graphics cards still “ask for” a refresh rate setting in their configuration. Windows still lets you change the refresh rate in the graphics driver settings, but this is entirely due to the analog heritage of graphics cards and their legacy support for CRT displays as output devices. While refresh rates do not apply to LCD panels, most LCDs will accept any setting of 60Hz or above.
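If you’re curious what your own driver advertises, the xrandr utility on Linux/X11 will list every output, mode and refresh rate. A quick sketch, assuming xrandr is installed; on an LCD, choosing one rate over another only changes the signal timing to the panel, not any visible flicker:

```shell
#!/bin/sh
# List the video modes and refresh rates the graphics driver advertises
# (Linux/X11; assumes the xrandr utility is installed). On an LCD,
# picking 60Hz vs. 75Hz here only changes the signal timing to the
# panel -- each pixel is held until the image changes, so there is no
# CRT-style flicker at any setting.
if command -v xrandr >/dev/null 2>&1; then
    xrandr --query 2>/dev/null || echo "no X display to query"
else
    echo "xrandr not installed"
fi
```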

So if you’re concerned about eyestrain, headaches or other discomfort and believe it has to do with your LCD monitor, it probably doesn’t. To alleviate that strain, you could try any of these approaches:

  1. Put in full-spectrum bulbs to brighten up the lighting in your environment (less strain on your eyes to see in darker work environments)
  2. Use “task lighting” closer to your direct work surface
  3. Step away from your computer monitor for a few minutes every few hours. Go outside, stretch, walk around, look at things far away and up close, and give your eyes a break from focusing on everything 18″-24″ in front of you.
  4. Make sure your monitor is the correct height for your eyes to see it. Too far below, and you’ll hunch over. Too high up and you’ll strain your neck and eyes to see it. Proper workstation ergonomics are critical to reducing eye, neck, shoulder and back strain.

Applying these ideas should reduce or eliminate the eyestrain, headaches and other pain you might feel when you’re spending hours every day in front of a computer screen.

Snapshot backups of EVERYTHING using rsync (including Windows!)

Let me just start by saying that I have a lot of data. In multiple places. Some on laptops, some on servers, some on removable drives and mirrored hard disks sitting in a bank vault (yes, really). Lots of data on lots of systems in different states and locations: client data, personal data, work data, community data and lots more.

Over the years, I’ve tried my best to unify where that data is sourced from, for example by relocating the standard “My Documents” location on all of my Windows machines (physical and virtual), to point to a Samba share that is served up by a GELI-encrypted volume on my FreeBSD or Linux servers. That part works well, so far, but that’s only a small piece of the larger puzzle.
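For reference, the server side of that redirection boils down to a single Samba share sitting on the encrypted volume. A minimal sketch of the smb.conf stanza – the share name, mount point and user here are all illustrative:

```
# smb.conf fragment (share name, path and user are illustrative)
# 'path' is the mount point of the GELI-encrypted volume
[documents]
    path = /crypt/documents
    valid users = youruser
    read only = no
    browseable = yes
```

On each Windows machine, the My Documents folder is then re-pointed at \\server\documents through the folder’s Properties dialog, so every application transparently reads and writes to the share.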

Over the last decade, the amount of data I’m holding and responsible for managing has grown significantly, and I needed a better way to manage it all.

There are plenty of backup solutions for Linux, including the popular Amanda and Bacula, but I needed something more portable, leaner and much more efficient. That quest led me to Unison, mostly due to its cross-platform support, but it was still a bit more complicated than I needed.

So I kept looking and eventually found rsnapshot, a Perl-based tool wrapped around rsync, the standard synchronization utility written by Andrew Tridgell.
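rsnapshot drives everything from a single configuration file. Here’s a minimal sketch of what one looks like; the paths and retention counts are illustrative, fields must be separated by actual tabs (not spaces), and older rsnapshot releases spell the retain directive as interval:

```
# /etc/rsnapshot.conf (sketch; separate every field with TABs, not spaces)
config_version	1.2
snapshot_root	/backup/snapshots/

# how many snapshots of each level to keep
retain	daily	7
retain	weekly	4

# what to back up: source path, then a subdirectory under snapshot_root
backup	/home/	localhost/
backup	/etc/	localhost/
```

With that in place, cron entries running rsnapshot daily and rsnapshot weekly handle the rotation automatically, and each snapshot appears as a complete directory tree under /backup/snapshots/.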

Since I’d already been using rsync quite a bit over the last 10 years or so to copy data around as I needed it and to perform nightly full backups of my remote servers, I decided to look into using rsync to manage a new backup solution based around incremental backups as well as full backups.

I’m already using rsync to pull a couple of terabytes of mirrored data to my servers on a nightly basis. I’m mirroring CPAN, FreeBSD, Project Gutenberg, Cygwin, Wikipedia and several other key projects, so this was a natural graft onto my existing environment.


How to Survive What is Coming

Everyone is feeling the crunch of the financial market teardown, from jobs lost to increased expenses to losing your house and home to bankruptcy. This is just the beginning, unfortunately. As more and more jobs are lost, and more and more markets dry up, the effects will cascade and probably accelerate through other markets. We haven’t seen the bottom yet, even though lots of “experts” claim we have.

I’m no expert in any of this, but I’ve lived on a thread for years before. I’ve lived through my employer’s downsizing and lived out of boxes while trying to find a new, more affordable place to live. I’ve struggled through very hard times, I’ve always made it through, and I’ll do it again if I have to. You can too.

So here are a few things that I’ve noticed and gathered, which may or may not help you with your own situation, whether it be technical, social or fundamental.

