
Posts Tagged ‘Cloud Computing’

Is Cloud-Based Data Protection Really the End of Backups?

Thursday, March 17th, 2011

Joel Maki at Zetta has been sending me pitches for about a million years. The problem is, Zetta provides enterprise storage services, which isn’t what most of my readers are looking for.

Most recently, however, he sent a copy of the report Zetta VP of Products Chris Schinn gave at the November 2010 Cloud Expo. There are some useful statistics about the growth of data and storage needs, incidence of data loss, and the like, that I thought readers might be interested in.


Zetta is pitching cloud-based sync and replicate as the wave of the future—with versioning to prevent replicating those dreadful “Oops!” moments when you manage to destroy the project you’ve spent weeks working on. This is not so different (in fact, you’d have to talk to representatives of the two companies to know precisely how different it actually is) from the Continuous Data Protection first advertised by LiveVault in their Backup Trauma Institute video in 2005. LiveVault has since been acquired by Iron Mountain, which is also offering cloud-based backup.

Enterprise services like these definitely have their advantages over tape and disk-to-tape, but I have the same doubts I did in 2005, for the same reason: we have terrible broadband infrastructure in this country. Retrieving a few lost files, or searching them, is certainly going to be easier with cloud-based backups than with tapes. And you have protection against fire, theft, and natural disaster. But are you really going to be able to restore terabytes, perhaps even petabytes of data via a network connection?

Schinn’s report suggests that in the event of a disaster, a company can simply “fail over” to the cloud-based files instead of actually restoring them. This is an interesting proposition and would certainly allow employees whose office had, say, been flattened in an earthquake or washed out in a tsunami, to continue to work remotely from anywhere they could get a connection. (Possibly not so easy to do following said earthquake and tsunami, as we’ve recently discovered.) Mounting storage is a bit different from mirroring the web server and mail server, but not all companies keep those on site, anyway, so chances are decent those machines are in a secure data center at a different location.

There are types of data I don’t back up online for security reasons, yet I find transfer speeds and possible outages a much greater deterrent to converting all my backup to a service like Zetta’s. For one thing, it’s not very logical for me to be unwilling to back up my Quicken data online when I file my taxes online and do my banking and most of my shopping online, and for another, serious security breaches involving financial and other institutions almost always involve the physical theft of physical backup tapes.

But infrastructure bottlenecks are a real issue. In most parts of the United States, small businesses and consumers alike have one choice for cable Internet and one choice for DSL. According to NetIndex, California has an average consumer download speed of 10.91 Mbps, and an upload speed of 2.24 Mbps. (Speedtest.net gives me a download score of 16.67 Mbps and an upload score of 4.21 Mbps, a bit above average.) Compare that to the Netherlands, where the average download speed is 23.51 Mbps. The average. The Ur-guru has Internet in excess of 100 Mbps downstream.

If I follow the math correctly (see Wikipedia on why the math of bits and bytes is never simple), a megabit is 1/8 of a megabyte. So if you had an 8 Mbps connection, you could move one MB (megabyte) per second through it. My connection is about 16 Mbps, so I can download 2 MB per second.

I have about 274 GB of data and software on this computer right now. That’s 274,000 MB. If I had to download that over my current connection, it would take 38 hours and change. (That’s assuming I actually maintained that download speed, and let me tell you, nothing I download ever downloads that fast. Official download speeds and actual download speeds are not the same thing.) A day and a half for the contents of one laptop.
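If you want to check my arithmetic (or plug in your own numbers), here's a quick back-of-the-envelope sketch in Python. The 274 GB, 16 Mbps, and 4 Mbps figures are just the rounded numbers from above; substitute your own.

```python
# Rough transfer-time estimate: how long to move one laptop's worth of data
# over a consumer broadband connection. All figures are illustrative.
data_gb = 274          # data and software on my laptop, in gigabytes
download_mbps = 16     # advertised download speed, megabits per second
upload_mbps = 4        # advertised upload speed, megabits per second

def transfer_hours(gigabytes, megabits_per_second):
    """Hours to move the given amount of data, at 8 bits per byte."""
    megabytes = gigabytes * 1000
    seconds = megabytes * 8 / megabits_per_second
    return seconds / 3600

print(f"Download: {transfer_hours(data_gb, download_mbps):.1f} hours")  # about 38 hours
print(f"Upload:   {transfer_hours(data_gb, upload_mbps):.1f} hours")    # about 152 hours
```

And it's the upload figure that matters when you're the one making the backups.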

If I had to upload all that…well, I wouldn’t. This is why some cloud storage companies, including Amazon S3, give you the option to make your first backup by sending a physical drive. (It’s not clear from Zetta’s website whether they do something like this.)

So until we have considerably faster upload speeds available to us, I don’t think the enterprise has really come to the end of backups, even though we might be moving from a disk-to-disk-to-tape model to a disk-to-disk-to-cloud model. Which does, I have to admit, sound like an improvement.

Clouds on Gladinet’s Horizon

Sunday, May 9th, 2010

If I spend the first several paragraphs apologizing and making excuses for not posting a column since April 2nd (ouch!), I’ll just be adding insult to injury. The Ur-Guru said to blame him, but he’s only been here since April 24th, so that won’t work. I just got caught up in other things—and I only wish all of it had been high-paying client work, which is the kind of excuse I like to be able to make.


Anyway, here at last is the review of Gladinet Cloud Desktop that I promised Jerry Huang ages ago.

The interesting thing about Gladinet is that it lets you back up to multiple cloud storage sites simultaneously. It also maps those sites as “My Gladinet Drive” in Windows Explorer so you can drag and drop from them. Given the awkwardness of reaching some of these services through their own interfaces, that’s a considerable benefit right there.

Once you install the program, there are a couple of screens of settings to configure.

gladinet initial settings

First, enter your license key if you have one. A home-user license for Gladinet Desktop Professional is $39.99 and a commercial license is $59.99. FTC disclosure: Jerry gave me a license key so I could test all the program’s capabilities. Then register with Gladinet. (Give them your name and e-mail address.)

gladinet virtual drives

Next, add some storage. I was impressed at the number and variety of possible storage locations, some of which I hadn’t heard of, and some of which I hadn’t known you could use as storage locations. I initially checked Google Docs and Amazon S3, but later signed up for Azure Blob Storage from Microsoft to make it a better test. (And what a nuisance that was—far more trouble than signing up for an Amazon S3 account, let me tell you. And what kind of name is “Blob,” anyway?)

gladinet general settings

Once you’ve chosen your storage locations, Gladinet will show you your general information and give you the option to change settings such as the drive letter it maps to (I wasn’t using “Y” for anything else, so I left it), whether to encrypt your profile, and so on.

Gladinet mount virtual directory

Before you can use the storage options you checked off, you have to provide login credentials. This was not too tricky with Amazon S3, since I’d had to do it with several other programs already and knew where to find the information. It was also fairly simple with Google Docs. It was notably confusing with Azure Blob and took several tries before I had the right information in the right place. That’s not Gladinet’s fault, mind you, but a certain lack of clarity on Microsoft’s part. Maybe if you’re a Microsoft developer you understand these things intuitively. If so, I don’t think the Azure Blob service is really meant for anyone else yet. But I digress.

If you use Skype, you might get an error message from Gladinet saying that Port 80 is blocked. Jerry says the easiest way to fix that is to go into your Skype options under “Connection” (in Advanced settings) and uncheck the box that says “Use port 80 and 443 alternatives for incoming connections.”


After setup is complete and you’ve mounted your virtual directories, you have several options. Gladinet installs a fairly sophisticated tool in your system tray/notification area/whatever they call it in Vista and Windows 7, and you can just right-click that to start the Gladinet Cloud Explorer, the Backup Manager, or the Task Manager—or to run backup tasks directly. You can start the Gladinet Management Console from the Start menu, as well, and the Gladinet Quick Launch screen will pop up when you boot your machine unless you do something to make it go away.

gladinet management tools

There are several options for backup with Gladinet. You can choose to back up all documents, pictures, “musics”, videos, folders, or select specific items. I wanted a relatively quick test, not an exhaustive hours-long marathon with my upstream connection speed as a bottleneck.

gladinet backup source

As you can see from the screenshot, Gladinet had no trouble seeing my network drives and considered all of them valid sources for backup, though it does warn that backups may not be real-time. Since I wasn’t planning to use it for continuous syncing, I wasn’t worried about that.

If you do choose to back up all your “musics” or videos or documents, Gladinet will go through all your drives to index those files. That can be a time-consuming process and slow down your system, so it warns you about that.

In this case, I just opted to back up my FileSlinger™ newsletter directory to all three backup destinations: Amazon S3, Google Docs, and Azure Blob.

Gladinet backup to multiple destinations

I got a prompt from Google Docs asking me whether I wanted to convert my Microsoft Office Word 2007 documents into Google Docs format or leave them as they were, but otherwise the job ran smoothly and quickly.

Gladinet multiple destinations

Once I’d run the backup, it was easy to go into the explorer and confirm that the files had, in fact, been backed up.

gladinet explorer detail

Though the interface can be a little tricky (between first testing Gladinet and writing this review, I forgot how a few things worked), the product is versatile and does what it claims to, and more than I used it for. (You can schedule backups or use Gladinet for continuous backup.) And, of course, if you don’t want to customize backups, you can use the simpler options and the system tray interface. The hardest thing may well be setting up your cloud storage accounts, as true cloud storage is still much more the province of geeks than online backup is.

Contest

For those who have hung in there in my absence, I have two free licenses of Gladinet Desktop Pro to give away. The two best (meaning most creative and entertaining) answers to the question “Why did Microsoft call its cloud storage Blob?” will win. (The judges are the Ur-Guru and me. Criteria entirely subjective.) Post your answers to the comments. You have as long as I was late to enter.

Hey! You! Get off of My Cloud!

Friday, February 5th, 2010

Review of the 3X Remote Backup Appliance

3X Systems private cloud backup appliance

The headline of the pitch I received back in November read “3X Systems Launches Private Backup Cloud Appliance for SMBs”. The notion of a “private cloud” intrigued me, so I decided to follow up. (Besides, no one quoted in the press release used the word “excited,” so they get extra points.)

We all talk about “the cloud” as if it’s some amorphous collective up in the sky somewhere, but none of these “cloud computing” services actually operates among the cumulus and cumulonimbus. Your data is not floating around among the raindrops or waiting to crystallize as snowflakes. All it means to use cloud services is that instead of installing the software on servers in your office building, it’s on servers in someone else’s data center, and you access it through the Internet. Cloud storage puts your data onto disks in a similar data center (or more than one, for redundancy), instead of on a backup drive in your office.

With most cloud services, you rent rather than own—though with companies like Google making so much available for free, consumers may forget that there are costs involved, the same way they seem to forget that there’s actual hardware involved.

With the 3X Remote Backup Appliance, you become your own online backup service.

Now, if you were geeky enough, you could find a way to do this without a special device. Personally, I’m not geeky enough. And I’m pretty geeky, relative to most people I know. So I think the 3X RBA is a great idea for three reasons.

  1. The biggest disadvantage of online backup is the slow speed of data transfer over the Internet. Because the 3X makes its initial full (or “seed”) backup over the local network, it’s much faster than typical online backup services. (How long did it take me to upload my 2 GB backup to MozyHome Free the first time? 12 hours? And that over a cable connection.)
  2. Many businesses—and even individuals—want to be sure of just who has access to their proprietary or confidential data. Running your own online backup service, with only your own company’s data on it, gives you complete control.
  3. If you have half a dozen or more computers to back up, the monthly or yearly cost of most online backup services is going to start to add up pretty quickly. Many of them charge per computer rather than by the amount of data backed up.

So I arranged to get a product demo and an interview with 3X CEO Alan Arman and some of the team, and also to get an evaluation unit to check out. The demo was very straightforward: it certainly looked easy enough to use. But things are often a bit different in real-life situations. I wanted to see whether the 3X would really be as easy to set up and operate as it seemed to be. (After all, the CloudPlug was harder than it appeared.)

rackmount 3X 500 series

The evaluation unit arrived on January 20th—the same day as my mother’s SaveMe drive. I was surprised at the size of the box. Based on the photos I’d seen, I was expecting something more the size of my Buffalo Quattro. This box was square and flat.

When I got home, I found out why. The 3X comes in two form factors: the cube, for desktop use, and the 1U rackmount model. Guess which was in that box.

“They sent you a what?” the Ur-Guru said. “Didn’t you go batty from the 40x40mm fans in a rack model!?”

To be fair, Richard Keggans at 3X offered to ship me a cube version instead when I told him about the mistake, but once he assured me that I could still use the rackmount model without a rack, it didn’t seem worth replacing it when I was only going to be using it for a few hours anyway. (The PR spokesperson, who has perhaps never been in a server room, said “We didn’t think it would make a difference to you.” Ha. I think he just wanted to be sure they’d get their $2500 device back.)

Anyway, for anyone else who’s never been in a server room—it’s not so much that the fan noise is loud. It’s not actually louder than, say, my space heater, which is also an electric fan. I did not really need to warn the neighbors to run for their earplugs before powering the thing up.

It’s just that there’s something about the quality of the noise that causes instant brain death. You can tell immediately why people lock these things in cages behind heavy doors inside secure buildings miles away from where they do their actual work. Even the Ur-Guru doesn’t work with rack-mounted systems, because the noise would be too much even for him if he had them in his home office.

So if I were a real customer, I would have bought the cube model, which Richard says is “almost silent.” And it probably is, too, because it isn’t being stored less than an inch from some equally hot device above and below it. The Buffalo Quattro, which has more hardware (though less software) in it, makes very little noise. (Interestingly, the two draw the same amount of power.)

Installation

Even given the awkwardness of having the wrong version of the device, it was easy to set up the 3X. (The printout of the Quick Start Guide was helpful, too.) Plug in the power cable, connect the Ethernet cable to your router, and turn the monster on. Then insert the memory stick with the 3X admin software into your computer and let it run. Then reboot your computer, and start up the 3X Systems Admin tool again. Your device should automatically appear; just select it and choose “Launch Manager.”

That takes you to the web interface, where you do all the sophisticated stuff, including downloading the client software so you can back up individual computers to the device.

3X Backup Manager

The critical thing at this juncture is to set up port forwarding—something I don’t think I’ve talked about since I reviewed ION Backup. I obviously hadn’t done anything with it since then, because I still had that port set up to forward. (Oops.) It took me a while to find the right screen in my router admin, but eventually I found what I needed, and it only took a minute to set it up after that.

Single Port Forwarding

There’s a connectivity check feature in the web manager tool for the 3X, so you can check to make sure that it’s possible to reach your device from outside your local network. This is important if you’re going to actually use it for its intended purpose as an online backup device. (And if you aren’t, why are you paying so much money?)
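If you're curious what that kind of connectivity check boils down to, here's a minimal, generic sketch in Python that simply tries to open a TCP connection to your public address on the port you forwarded. The hostname and port below are placeholders, not anything 3X-specific, and you'd want to run it from outside your own network (a friend's connection, say) for the result to mean much.

```python
import socket

# Placeholders: substitute your own public hostname/IP and the port you
# forwarded to the backup appliance in your router.
PUBLIC_HOST = "example.dyndns.org"
PORT = 4500

def port_reachable(host, port, timeout=5):
    """Return True if a TCP connection to host:port succeeds within `timeout` seconds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

if __name__ == "__main__":
    status = "reachable" if port_reachable(PUBLIC_HOST, PORT) else "NOT reachable"
    print(f"{PUBLIC_HOST}:{PORT} is {status}")
```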

I then downloaded the client and set it up on my netbook. This worked pretty much the same as installing any other backup software. Once it’s installed, however, you have to get a key from the administrator (provided in the backup manager, above) and then the administrator has to approve you. The administrator can also create backup sets and set quotas for client computers.

3x-backup-registration

I’d read some of the instructions for creating backup sets while waiting for Enna to reboot after the initial 3X admin tools install, so I figured I was all set to define my backup set and go. I did run into one small issue, however: when I clicked “Edit” under the “Backup Sets” tab, the top of the window ran off my 1024 x 600 netbook screen.

3x-backup-client-interface

I was still able to create a backup set that would copy everything on the C:\ drive except for the Recycler, System Volume Information, Windows folder, and Program Files. When I eliminated those, I was left with 9 GB of data, and the 3X copied them quickly (if loudly) while I had lunch.

I’m not really in a position to test the deduplication and some of the other features of the 3X, but deduplication is the reason you can back up several computers to a 100 GB drive. All enterprise systems rely on it these days, but almost no SOHO systems offer this level of dedupe. (And if you have people sending e-mail attachments to others in your company, you’re racking up duplicate files fast, never mind duplicate software installations.)
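For the curious, the core idea behind deduplication is simple enough to sketch: hash each chunk of data and store any given chunk only once, no matter how many files (or how many computers) contain it. Here's a toy illustration in Python that treats whole files as chunks; real systems split files into blocks and hash those, but the principle is the same.

```python
import hashlib
from pathlib import Path

store = {}   # hash -> file contents (the shared storage pool)
index = {}   # filename -> hash (enough information to restore each file)

def backup(path: Path):
    """Save a file; only content the store has never seen costs any space."""
    data = path.read_bytes()
    digest = hashlib.sha256(data).hexdigest()
    if digest not in store:
        store[digest] = data
    index[str(path)] = digest

def restore(name: str) -> bytes:
    """Reassemble a file from the store using its recorded hash."""
    return store[index[name]]

# Ten employees who each saved the same 5 MB attachment consume 5 MB in the
# store, not 50 MB -- which is how several machines can share a 100 GB appliance.
```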

So I’ll leave those to another reviewer and just say that my “seed” backup went smoothly. It was time to test the remote backup.

Remote Backup and Restore

The normal way to use a 3X is to disconnect it from the local network when the seed backups are complete and move it to another location—the business owner’s home, a different office, or even a cage rented in a data center. Then you plug it in and hook it up to the Internet, and the client computers will back up incrementally according to their schedules. Because only the changes are backed up, this doesn’t take much bandwidth or time.

I don’t have an alternate office location, much less a pocket data center, so I took my computer elsewhere instead. I headed over to the local public library to see whether I could back up and restore data using their free wi-fi connection.

I didn’t have so much luck backing up. I’m not sure why, but the program just seemed to sit around endlessly calculating the size of the backup. (Not very large: I had downloaded a whole two image files so there would actually be some changes to back up.) This might have had something to do with the very slow wi-fi connection, or it might just be that the backup client has to scan the entire machine before running. My battery and patience were running low, so I aborted the backup and tried a restore, instead. The main thing, in my mind, was to confirm that I could connect to the 3X from outside my network.

Clicking the “Restore” button gave me the option to restore one file or several. I then had the chance to browse to my chosen file and to select which backup I wanted to restore from. (I’d only made one, but had earlier opted to save 10 backups.) In order to be sure the restoration had worked, I restored it to a different directory.

And work it did. It took a little longer than I might have expected for a file of modest size, but I don’t think that was the fault of the 3X. That wi-fi connection was really slow. Not quite shades-of-dial-up slow, but I am reminded of the early days of the Web and the expression “Graybar land.”

So the verdict is: it works. You really can become your own online backup service provider. To make it work, of course, you need a place to set the 3X up. If you’re like me and work out of your home, you might need to make an arrangement with a colleague to each keep the other’s remote backup appliance. But the ideal customer is not the home office user, but the person who runs a small office with multiple computers—enough of them that paying for Mozy Pro for a year would more than cover the cost of buying one of these.

And now that that’s done, I can shut it off and hear myself think again.

PS In the week between the time I wrote this and the time I published it, 3X was named one of the 20 Coolest Cloud Storage Vendors by Computer Reseller News.

A New Way to Back Up WordPress

Friday, January 15th, 2010

Automatic WordPress Backup logo

If you search the WordPress plugin repository for “backup”, you’ll get—as of today—195 results. I wrote about two of those plugins, WordPress Database Backup and WordPress Backup by BTE, just about a year ago, and installed them on all 8 of my own WP sites, as well as insisting that my clients use WP-DB-Backup at minimum.

Both of those plugins back up different parts of a WordPress installation and then either save it there on the server or e-mail it to the admin. I get a lot of e-mails with database backups, as you can imagine. These aren’t large files, and it’s not too time-consuming to save them with other client files and let them get backed up as part of my regular backup routine.

But the BTE plugin backs up your uploads, plugins, and themes directories. And those can start to get pretty large after a while. Not large in absolute terms of how much room I have on my hard drive or backup drives, but large in terms of what it’s convenient to receive by e-mail, especially multiplied by eight or more. And then there’s the fact that the mail server for Author-izer.com, my primary business website, absolutely WILL NOT accept the plugins backup file, even though it’s a ZIP. It believes that file is full of malicious code out to attack me, and refuses it. (Ta ever so, mailer-daemon.) And then there’s the lack of versioning, because each week’s backups of those directories have the same name. These are minor annoyances, but real.

Now there’s a new plugin that combines the functions of these two stalwarts, with a few extras besides: Automatic WordPress Backup, sponsored by Melvin Ram’s Web Design Company, developed by Dan Coulter.

AWB lets you schedule daily, weekly, or monthly backups of your database, your wp-config.php file, your wp-content folder (themes, plugins, and uploads), and even your .htaccess file. Instead of e-mailing them to you, it uploads them to Amazon S3.

aws logo

S3 stands for “Simple Storage Service.” It’s not actually quite as simple as all that, but the idea is that you only pay for as much storage and bandwidth as you actually use. Since a typical WordPress installation—even with a lot of plugins and uploads—isn’t very large, backing up via S3 shouldn’t cost more than a few cents each month.
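To put "a few cents" in perspective, here's a rough calculation. The $0.15 per GB-month rate is an assumption for illustration (roughly what S3 charged for standard storage around this time); check Amazon's current price list before taking it to the bank.

```python
# Back-of-the-envelope S3 storage cost for weekly WordPress backups.
price_per_gb_month = 0.15   # assumed rate, USD per GB-month -- verify against AWS pricing
backup_size_gb = 0.05       # a 50 MB archive (database + wp-content), illustrative
backups_kept = 8            # say, two months of weekly backups before pruning old ones

monthly_cost = price_per_gb_month * backup_size_gb * backups_kept
print(f"Roughly ${monthly_cost:.2f} per month")   # about $0.06, plus a little for transfer
```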

Before you install the plugin, go to Amazon S3 and sign up for an account if you don’t have one already. (Signing up is free.) Once you get that confirmed, go to “Security Credentials” under the “Your Account” tab to get the information you’ll need to configure the plugin.

WDC options

Then log into your WordPress dashboard and install the plugin normally. There’s a handy YouTube video that walks you through installation over on the AWB website. This is a nice touch. I just wish Amazon had done the same for S3! Once you activate AWB, you’ll be prompted to configure the settings. If you need to find them later, they have their own options submenu at the foot of the right sidebar.

 

Fill in your AWS Access Key and Secret Key, create an S3 “bucket” (the Ur-Guru was a bit disparaging about that term) to store your backup in, and decide what you want to back up, how often, and when to get rid of old backups.
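To give a sense of what happens behind the scenes when the plugin talks to S3: AWB itself is written in PHP, so the snippet below is purely an illustrative sketch, using Amazon's boto3 library for Python, of the kind of calls involved in creating a bucket and uploading a backup archive. The bucket name, file name, and credentials are all made up.

```python
import boto3

# Illustrative only -- not AWB's actual code. Substitute your own credentials.
s3 = boto3.client(
    "s3",
    aws_access_key_id="YOUR_ACCESS_KEY",
    aws_secret_access_key="YOUR_SECRET_KEY",
)

bucket = "my-wordpress-backups"   # S3 bucket names must be globally unique
s3.create_bucket(Bucket=bucket)   # the "bucket" the plugin asks you to name

# Upload this week's backup archive, then list the bucket to confirm it arrived.
s3.upload_file("wp-backup.zip", bucket, "wp-backup.zip")
for obj in s3.list_objects_v2(Bucket=bucket).get("Contents", []):
    print(obj["Key"], obj["Size"])
```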

AWB-settings

I like both the option to automatically delete old backups and the option to make backups only once a month. There are sites that I don’t update any more often than that, even though I know I should.

When I first installed AWB on the test blog over at the Podcast Asylum, it didn’t seem to work. After you hit “Save Changes and Back Up Now,” you see a message telling you that there will be a link to download your most recent backup when you come back to that page—but there was never any link.

That was when I realized I didn’t know how to see what was on my Amazon S3 server. Amazon’s own site wasn’t too helpful; their AWS Management Console doesn’t work with S3 yet. Fortunately, there are plenty of other tools to let you get access to your S3 account. I picked S3Fox, a plugin for the Firefox web browser. Once I’d installed that, I was able to confirm that while my “testblog” bucket had been created, there was nothing in it.

Yet when I installed Automatic WordPress Backup here on the FileSlinger Backup Blog, it worked just fine. Was this a hosting issue, I wondered? (The Podcast Asylum site is on Dreamhost and the Backup Blog is on GoDaddy—and I don’t actually recommend either of them for WordPress hosting these days.) I got in touch with Melvin Ram, who walked me through installing the development version of the plugin, due for release next week sometime.

That fixed the problem: after clicking “Save Changes and Back Up Now,” I saw the following message:

AWB-restore-interface

That “Restore from a backup” tab is new in the development version; in the current version, 1.0.2, you have to download the backup and restore it manually. Not quite all the bugs are out of the restore process yet, though. I double-checked in S3Fox, and sure enough, the ZIP file was there.

S3testblog

I did notice, however, that while the ZIP file contained my wp-config.php file, my .htaccess file, and my wp-content folder, it was missing my WordPress database. (So was the one from fileslinger.com.) So I might want to wait through a few more development versions before I completely replace WP-DB-Backup with Automatic WordPress Backup.

Nevertheless, I think WDC has made a great start with this plugin, and that it’s going to be extremely useful once they’ve got the bugs out.

Putting Your Data in Danger

Saturday, October 24th, 2009

Would you entrust your data to a company called “Danger”? Microsoft and T-Mobile did. And it was your data, if you were a Sidekick user.

The adventure began on October 10th. The headline in TechCrunch read “T-Mobile Sidekick Disaster: Danger’s Servers Crashed, And They Don’t Have A Backup.” Jason Kincaid, author of the TechCrunch article, was absolutely scathing on the subject:

This goes beyond FAIL, face-palm, or any of the other internet memes we’ve come to associate with incompetence. The fact that T-Mobile and/or Microsoft Danger don’t have a redundant backup is simply inexcusable, especially given the fact that the Sidekick is totally reliant on the cloud because it doesn’t store its data locally.

I’ve never used a Sidekick, but a mobile device that doesn’t store phone numbers and other data locally at all seems bizarre. In fact, I’m not sure that statement is quite accurate, given suggestions in other articles that if you keep the Sidekick charged and turned on, it will at least retain whatever is in its current memory.

But then, I still have a “dumbphone,” so what do I know about how these things work?

By October 11th, T-Mobile had posted the following discouraging notice on its user forums:

Regrettably, based on Microsoft/Danger’s latest recovery assessment of their systems, we must now inform you that personal information stored on your device — such as contacts, calendar entries, to-do lists or photos — that is no longer on your Sidekick almost certainly has been lost as a result of a server failure at Microsoft/Danger.

Not surprisingly, the media has been all over the story. “Microsoft has said that the hardware failure that caused the problem took out both the primary and backup copies of the database that contained Sidekick users’ information,” Ina Fried wrote on October 12th. “But the question remains, why wasn’t there a true independent backup of the data?”

That would certainly be my question. Rafe Needleman, also writing for CNET on October 12th, concluded that you can’t trust the cloud because you can’t trust the people running it. The problem, in other words, is not one of technology. Tech support staff often refer to problems that start “between the keyboard and the chair.”

If it’s possible to create independent, redundant backups in your own data center, it’s possible to do it in the data centers used by cloud computing companies. The only difference is that you can’t walk down the hall and see that they’ve done it. Some people will slack off when you aren’t there to hold them accountable, but that’s not true of everyone. As Lance Ulanoff concluded in his October 13th article, “Don’t Blame Cloud Computing for the T-Mobile Mess,”

Obviously, something went very, very wrong with T-Mobile and Microsoft’s Sidekick data set-up, but let’s not throw out the baby with the bathwater (or the cloud with the rainwater). The cloud isn’t the problem. Instead, I blame the people—as always.

But the Register, with typically British enthusiasm for a pun, declared “Danger Lurks in the Clouds” on October 18th. The danger is that all mobile devices will rely increasingly on a working connection to provide any functions at all. Nevertheless, author Bill Ray concludes:

Cloud-based servers are still more reliable than most of the kit knocking around users’ homes – the life expectancy of an Apple Time Capsule, for example, is just over 17 months according to the Time Capsule Memorial Register, so even those who are backing up locally shouldn’t be too smug.

That article concludes, in the Register’s usual tongue-in-cheek fashion, that paper is the only safe storage medium.

By October 20th, Microsoft and Danger had in fact been able to restore some of the data, as reported in CNET and on T-Mobile’s user forum. That’s a happier ending than Sidekick owners had been led to expect. I’m glad they got their data back, but if I’d been affected, I’d want more.

I’d want to know what the company was going to do differently from now on so that this wouldn’t happen again. And I’d want a free application that would let me back up all my contacts, calendar entries, and so on, onto my computer. It wouldn’t even have to sync with Outlook or Google or Mac-whatever, as long as I’d be able to restore the data to my mobile device.

Finally, as an occasional naming consultant, I want to see Microsoft Danger rebranded. What incentive do you have to entrust something valuable to a company called Danger? What incentive do employees of a company called Danger have to be careful? Danger is a fun name for a company that makes games, but for data storage, it just sounds unreliable.
