Random blog links volume umpteen
It's time yet again to post a bunch of blogs that have randomly sent visitors here via blogger's random blog feature thing.
Dreadful photos of Rita's destruction
IsFullOfCrap, a Houston-based blog, has on-the-scene photos of the destruction caused in the neighborhood by Hurricane Rita. I mean, jeez, look at all that flooding.
Seriously though, they have dozens of links to other local blogs and websites if you're interested.
Microsoft actually does something cool
Microsoft has a certain knack for following other people's lead at times, such as their Virtual Earth app that comes on the heels of the enormously popular Google Earth, but I actually have to hand it to Microsoft for something. Their Virtual Earth interface allows you to view some traffic cams in Texas.
I'm such a geek that I almost wish I didn't have a date tonight so I could sit here and watch the storm roll in.
Of course it will be dark there soon, but there are enough satellites in space dedicated to keeping me informed that I could keep myself busy.
I remember playing with imagery like this before it was cool!
Check out the Houston Chronicle's local blog, and I highly recommend Kris Alexander's blog. He works in emergency management around Austin, Texas and is really sharp even when a hurricane isn't bearing down on him.
The maps at the Perry-Castañeda Library are second to none, SailWX is a good place to track the locations of ships around the globe, Rigzone tracks (you guessed it) oil rigs, and the NRL Monterey Marine Meteorology Division has some of the best maps and a catalogue of weather imagery that rivals NOAA's.
Here she comes
I really hate to jump on the Rita hysteria bandwagon, but she does take a good picture.
Nothing this government of ours does makes sense anymore. Laura Rozen points us to a story about how the Food and Drug Administration hired a veterinarian as head of the Office of Women's Health. A week later, they announced that someone else had the job and denied their first announcement, although staff tell a different story.
You may ask yourself...
why would the Bush administration set up an internet-based fundraising effort for the Iraq war? But then you probably hadn't yet heard that between one and two billion dollars of United States taxpayer money has vanished from the Iraqi Ministry of Defense.
It would only take 10,000 Bush Pioneers to donate enough money to his web drive to pay it all back to the American taxpayer, so I assume that is his goal. Surely it must be something they're doing for the American public. It certainly wouldn't be anything they could benefit from. Could it?
But hey, it's all OK. Don't worry about that $200 billion blank check Karl Rove is getting to rebuild the Gulf Coast. Just because we have record deficits and pay for the war in Iraq by borrowing money from communist countries on a daily basis doesn't mean the fools in charge aren't wise when it comes to spending our money. What are you, anti-American?
Seriously, the world just stopped making sense a long time ago. I put this date at January 20, 2001.
NOAA imagery interface re-tweak
UPDATE: Stupid me, I wrote the script with a hard-coded link and changed it to a variable at the last minute without testing it. If you downloaded the script and it didn't work, all you have to do is edit one word in the install script. On line four, change "$main" to "$MAIN", just uppercase the sucker and away you go. I fixed it, uploaded the zip file again, and updated the link below, so it's all good now.
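If it isn't obvious why one letter broke everything: shell variable names are case-sensitive, so $main and $MAIN are two completely different variables, and only one of them was ever set. A quick illustration, with a made-up path standing in for the real one:

```shell
#!/bin/bash
# Shell variable names are case-sensitive: $main and $MAIN are unrelated.
MAIN="/imagery/root"      # hypothetical value, set near the top of the script

echo "main is: '$main'"   # $main was never set, so nothing appears in the quotes
echo "MAIN is: '$MAIN'"   # the uppercase variable holds the actual path
```

That empty expansion is why the script silently did nothing instead of erroring out.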
After I screwed up and deleted four hours of tweaking the NOAA directory structure to make it more useable, I took the short route to "recovering" my work. Instead of writing another long script that could be downloaded to do it all automatically, I simply did all the HTML editing, relinking etc etc, zipped it up, and put it up for download.
This is just a skeleton. It contains the complete directory structure and HTML files pointing to the geographical regions of the imagery. The only thing missing is the 10 gigabytes of jpg images; you'll have to supply those on your own.
If you have the imagery and use UNIX, there is a BASH shell script included that will move all the images to their proper directories. If you're on Windows, you're screwed.
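The gist of the included script is something like this sketch (the filename prefixes and the map file here are hypothetical stand-ins, not the actual NOAA names):

```shell
#!/bin/bash
# Sketch of the sorting idea: read "prefix region" pairs from a map file
# and move every jpg starting with that prefix into its region directory.
# All names here are made up for illustration.
sort_images() {
    local main="$1" map_file="$2"
    local prefix region
    while read -r prefix region; do
        mkdir -p "$main/$region"
        # glob expands after the variables; errors for empty matches are hushed
        mv "$main/$prefix"*.jpg "$main/$region/" 2>/dev/null
    done < "$map_file"
}
```

So a map line like `24348660 Louisiana/New_Orleans` would sweep every `24348660*.jpg` into `Louisiana/New_Orleans/`, one region at a time.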
Here's how it works. Imagine we want to find an image of a certain location in New Orleans. The NOAA site starts on the first page and goes to the New Orleans page right next to it, which, like all pages, points to a single directory where thousands of images are stored. You can click an image, but the map is so poor that you never really know if you're close or not. If you've downloaded all the images, you have to pick the New Orleans images from a directory of 8,000.
My tweak starts on a modified start page that looks like this (I had to zoom the image out to take the snapshot, so it looks funky):
Once you click on one of the boxes for New Orleans you are taken to an HTML page inside a /Louisiana/New_Orleans/ directory, and all the images for New Orleans are right there in that directory. The New Orleans HTML page still points to them, but you don't have to wade through thousands of Louisiana, Mississippi and Alabama images to zero in on New Orleans.
Basically, it just subdivides the images into their own geographical subsets.
Previously, I posted a simple tweak of the start page for the imagery that lists all the regions. In this new tweak, there is a directory for all 41 places on that list, so if you're looking for Dauphin Island, Alabama imagery, it's all together in its own folder.
Funny thing about running KDE on Linux. There is an option for "move to trash" and one for "delete". Delete is permanent, and I hate "recycle bins", so I always choose delete. In three years I have never deleted anything I didn't want to delete. Never.
As you may have noticed, I like playing around with satellite and aerial imagery. When Hurricane Katrina hit, NOAA started taking aerial photographs of the affected areas in Louisiana, Mississippi and Alabama. They took and released over 8,000 images in all, something like 8 gigabytes.
But one of the problems with NOAA's imagery is that it was released without being processed: north is not up, and the distortions inherent in photographs taken from above are not ironed out. Worst of all, their interface and file structure totally suck.
I finally sat down this morning to work on the problem of this imagery's crappy structure. I took a bunch of NOAA HTML image maps of the photographed areas, divided everything into its own separate directory, made child directories based on geographic locations, and moved all the relevant images into these directories so you would only have to look through a couple dozen images to find the one you need instead of 1,000. I added thumbnails, updated all the links in the HTML maps, and did about 400 other things that took forever and simplified working with this imagery by several orders of magnitude.
As I did all this, I was constantly writing short shell scripts to make lists and edit files accordingly: copying and editing HTML, moving images, creating links in the general maps, and so on. During the process I wrote it all down in a single shell script, so anyone who has the imagery and the original NOAA legend on their hard drive can just run the script and have it do everything for them.
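To give you an idea of the kind of one-liners involved, here's a sketch of the link-rewriting step, the kind of thing that has to run once per region after the images are moved. The prefix and directory names are made up for illustration, and I'm assuming GNU sed for in-place editing:

```shell
#!/bin/bash
# Rewrite relative image links in a copied HTML map so they point into a
# region's new subdirectory. Names are hypothetical; assumes GNU sed (-i).
relink() {
    local html="$1" prefix="$2" region="$3"
    sed -i "s|href=\"$prefix|href=\"$region/$prefix|g" "$html"
}
```

Running `relink new_orleans.html 24348660 Louisiana/New_Orleans` would turn `href="24348660_a.jpg"` into `href="Louisiana/New_Orleans/24348660_a.jpg"` throughout the page.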
As I often do when writing scripts and changing things around, I did all this in a temporary directory with copies of the originals. All in all it took a little over four hours to finish, and boy was I glad to be done.
As soon as I was done and tested everything to make sure it worked, I went up one directory to where the original data was located and thought "Guess I don't need the temp folder anymore..." and, as soon as I right clicked and hit "delete", I realized all my work was still in there!
It was all gone instantly. On Windows this would not be a problem, because when Windows deletes something, it really just hides it, changes the first letter of its name, and tells the system that the space can be overwritten. It's still there 100% as long as the system doesn't need to use that exact space.
But I use the Reiser journaling file system, which is almost impossible to recover data from. I am currently trying to rebuild the directory tree, a method that is estimated to "possibly recover some data but it may be horribly corrupted." It's going to take a while on my 160 gig hard drive, though. There are, it says, about 18,000 seconds left in this process.
Update: Wow, it finished as I was writing this. Those estimates are never right. Unfortunately, it is not possible to recover a single file that I need. Oh sure, I can recover a lot of things that I deleted six months ago, but everything from this morning is corrupted.