
Check your cache pages for smoke and water damage!



So.... What did you do with your enforced mini vacation from geocaching.com?

 

Lemme see...

 

I tried to get folks to buy Geomate.jr units since they come preloaded with 250k caches... I'm trying to buy stock in Geomate.jr. :laughing:

 

I started a rumor that Jeremy was eating Pop Rocks and drinking soda next to the server and that was the reason for the outage. :laughing::D

 

I uncovered the conspiracy between REI and Geomate.jr to take down geocaching.com for an extended period to sell more Geomate.jr units, using Greenpeace and PeTA as pawns to take out the server facility. B):anitongue:

Link to comment

Helped my daughter with her 4-H project. She learned how to use the table saw, miter saw, and drill press.

 

[attached photo: 27449587-37b9-43b7-b1e6-b335c887eac6.jpg]

 

 

I also went into the woods to find a big log to hollow out and use as camouflage for a cache I'm planning. No luck.

Edited by BlueDeuce
Link to comment

Very nice bird house!! Is it going to be a cache? :D

 

I played Guitar Hero with my son. :anitongue:

 

Helped my daughter with her 4-H project. She learned how to use the table saw, miter saw, and drill press.

 

[attached photo: 27449587-37b9-43b7-b1e6-b335c887eac6.jpg]

 

 

I also went into the woods to find a big log to hollow out and use as camouflage for a cache I'm planning. No luck.

Link to comment

I cried a little. Then I bought swag for a couple caches I want to get out. I came home and tried to log on so I could submit my caches. Whined a little to my muggle husband. Wondered if playing Warhammer 40K with him would be a sufficient trade to get him to cache with me. Then we went to see the fireworks in a great new park with a lovely view of the Arch. Needs a cache. Came home and went to bed.

Link to comment

I went through withdrawal. It started with me waking up to see that GC.com was down, so I decided it was time to go back to sleep, but I couldn't! I broke out into a cold sweat later in the day and followed it up with some pain medication that I got from a buddy of mine a few days ago. After I ate some of those I was flying high and I could bear to live without GC.com.

 

Of course, that's just a mini fabricated story. I loaded up a PQ from a week ago and went to find caches I didn't have time to get last weekend. :D

 

Is it bad that I had to register on a WI geocaching board to get updates on the problem? (Until I found Jeremy's Twitter page.)

Edited by steedaq155
Link to comment

Glad to see everything back online. Some of the logs are a little moist, but with all the rain we've been getting in the Northeast, it's to be expected.

 

 

Someone just mentioned, on the local caching community board, that the US government is replacing the plastic DECON containers with zip-loc-type bags. I wonder if they're more leak-proof now? I mean, if the packet the DECON kit is in leaks, doesn't that defeat the purpose?

 

Mental note for the next time with the server: redundant UPS! I keep a 500 VA unit plugged into the 1250 VA unit, between the computer and the wall socket. When the 1250 dies, the 500 kicks in. It actually remained running for close to 3 hours! (Well, it was a low-draw computer.)
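
As a sanity check on that runtime, here's a rough back-of-the-envelope sketch in Python. All the numbers are assumptions for illustration (a small 500 VA unit often carries something like a 12 V / 7 Ah battery), not specs for any particular UPS:

```python
# Rough UPS runtime estimate: usable battery energy divided by load draw.
# Every figure below is an illustrative assumption, not a spec for a real unit.

def ups_runtime_hours(battery_wh: float, load_watts: float,
                      inverter_efficiency: float = 0.8) -> float:
    """Estimated hours of runtime for a given battery size and load."""
    return battery_wh * inverter_efficiency / load_watts

battery_wh = 12 * 7  # assume a 12 V / 7 Ah battery, ~84 Wh
for load in (25, 60, 150):  # watts: a low-draw box up to a busy desktop
    print(f"{load:>4} W -> ~{ups_runtime_hours(battery_wh, load):.1f} h")
```

At ~25 W this works out to roughly 2.7 hours, which squares with "close to 3 hours" on a low-draw computer.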

Link to comment

 

Mental note for the next time with the server: redundant UPS! I keep a 500 VA unit plugged into the 1250 VA unit, between the computer and the wall socket. When the 1250 dies, the 500 kicks in. It actually remained running for close to 3 hours! (Well, it was a low-draw computer.)

 

I was kinda wondering about that when I read the announcement. Most serious data centers have alternate power ready to go. My company had a UPS good enough to run our 30 servers for 10 minutes and a backup generator that kicked on automatically and could run indefinitely off of natural gas.

Link to comment
I was kinda wondering about that when I read the announcement. Most serious data centers have alternate power ready to go. My company had a UPS good enough to run our 30 servers for 10 minutes and a backup generator that kicked on automatically and could run indefinitely off of natural gas.

One of the news stories said that one of their generators was fried in the fire, and the other was under water from putting the fire out. That's why they had to bring in truck-mounted generators and wire them into the building. That takes time.

Link to comment
I was kinda wondering about that when I read the announcement. Most serious data centers have alternate power ready to go. My company had a UPS good enough to run our 30 servers for 10 minutes and a backup generator that kicked on automatically and could run indefinitely off of natural gas.

One of the news stories said that one of their generators was fried in the fire, and the other was under water from putting the fire out. That's why they had to bring in truck-mounted generators and wire them into the building. That takes time.

 

I would have thought they would use a state-of-the-art fire suppression system of the kind typically used in data centers, such as Halon or an Aero-K generator, not water.

Link to comment
I was kinda wondering about that when I read the announcement. Most serious data centers have alternate power ready to go. My company had a UPS good enough to run our 30 servers for 10 minutes and a backup generator that kicked on automatically and could run indefinitely off of natural gas.

One of the news stories said that one of their generators was fried in the fire, and the other was under water from putting the fire out. That's why they had to bring in truck-mounted generators and wire them into the building. That takes time.

 

I would have thought they would use a state-of-the-art fire suppression system of the kind typically used in data centers, such as Halon or an Aero-K generator, not water.

 

Except that the generators are not located in the data center. I believe they are in the parking garage in that building.

Link to comment

I would have thought they would use a state-of-the-art fire suppression system of the kind typically used in data centers, such as Halon or an Aero-K generator, not water.

 

The fire was in a vault in the underground parking garage, if the stories are correct. It sounds like the generators may have been on the same loop as the fire-suppression sprinklers, or they were flooded by the runoff. Oops.

Link to comment

I would have thought they would use a state-of-the-art fire suppression system of the kind typically used in data centers, such as Halon or an Aero-K generator, not water.

 

The fire was in a vault in the underground parking garage, if the stories are correct. It sounds like the generators may have been on the same loop as the fire-suppression sprinklers, or they were flooded by the runoff. Oops.

 

Lack of planning. :D

Link to comment
I would have thought they would use a state-of-the-art fire suppression system of the kind typically used in data centers, such as Halon or an Aero-K generator, not water.
The fire was in a vault in the underground parking garage, if the stories are correct. It sounds like the generators may have been on the same loop as the fire-suppression sprinklers, or they were flooded by the runoff. Oops.
Lack of planning.

In a former life, I was an electronics technician in the Navy. Systems could have multiple redundancies and still fail.

 

One instance I remember was when a power supply for a transmitter went down. There was a very loud hum (enough to vibrate the floor), the lights in the facility dimmed, there was a loud ka-thunk, I heard the normal switches and relays switching over for the other transmitter to automatically take over, and the normal fault alarm went off. Nobody moved; we just kind of looked at each other. But then, in less than a second, there was another ka-thunk and the lights went out. The emergency lights kicked in. The major fault alarms went off. Everybody ran. The site was down. Even as we were moving we could feel the massive diesel generators starting to fire up. Within seconds power was restored. However, both transmitters were down.

 

For about 15 minutes folks were wringing their hands, as the whole complex was, in essence, down: we weren't transmitting. We could receive, but that's it.

 

Long story short: even though the transmitters were completely separate, the only commonality being the waveguide switches, control circuits, and power feeds, one not only had a profound effect on the other but on the whole site. The problem was a loose feed wire into the three-phase transformer. It came loose and started arcing. The inductive kick was so strong and drew so much power that the system couldn't supply the amperage. The result was a catastrophic drop in voltage, causing a cascade of faults back toward station power. Three hours later the transformer was repaired and the transmitter was back in its ready, redundant state.

 

So, you see, even when you plan for a host of redundant systems, you can't plan for everything. Sometimes, stuff just happens.

Link to comment

So, you see, even when you plan for a host of redundant systems, you can't plan for everything. Sometimes, stuff just happens.

 

I know I'm armchair quarterbacking here, and my opinion is worth as much as they paid for it (zilch!), but it seems to me that you should be coordinating your emergency systems. Unless there's a lot we don't know about what happened (there likely is), this just looks like incompetence in design.

Link to comment

So, you see, even when you plan for a host of redundant systems, you can't plan for everything. Sometimes, stuff just happens.

 

I know I'm armchair quarterbacking here, and my opinion is worth as much as they paid for it (zilch!), but it seems to me that you should be coordinating your emergency systems. Unless there's a lot we don't know about what happened (there likely is), this just looks like incompetence in design.

I am reminded of massive power outages that occur from time to time due to something comparatively minor or unanticipated. Or planes going down because birds were sucked into the engines. Incompetence in design? Not likely. Like CR said, sometimes stuff just happens.

Link to comment

I spent four hours in the afternoon doing a much-needed maintenance run on my nasty, large multi-cache with a geopal, reworking the stages we couldn't find to make it even nastier and larger. :D

I was able to find two stages purely from memory, including a bison in the woods and the final. I don't know why my GSAK "my hides" db didn't have the child waypoints in it, but it will by Monday.

Link to comment

I am reminded of massive power outages that occur from time to time due to something comparatively minor or unanticipated. Or planes going down because birds were sucked into the engines. Incompetence in design? Not likely. Like CR said, sometimes stuff just happens.

 

Building the generator room so the sprinklers drain into it? Sounds like a design flaw.

Link to comment

I didn't know what to think! :D My computer crashed two days before and I had just gotten the new one up and working, so I checked into the one web site that always works :laughing: , geocaching, and since it didn't work I was lost :anitongue: so we left camp, went across the river to Laughlin, NV, and invested in the casino, one coin at a time.

Link to comment

I would have thought they would use a state-of-the-art fire suppression system of the kind typically used in data centers, such as Halon or an Aero-K generator, not water.

 

The fire was in a vault in the underground parking garage, if the stories are correct. It sounds like the generators may have been on the same loop as the fire-suppression sprinklers, or they were flooded by the runoff. Oops.

 

Lack of planning. :D

 

I suspect that most disaster plans are not fully tested.

 

I was working as a site systems/network administrator at Hewlett-Packard in 1989 when the Loma Prieta earthquake hit. One of the other HP buildings at another division had a large computer room in the basement. The plumbing pipes two floors above it broke and flooded the computer room with water four feet deep.

 

I was (and still am) on vacation when the GC.com site went down, so I had my GPS fully loaded with waypoints. I only found one cache yesterday and just logged it a few minutes ago.

Link to comment

I went geocaching, of course! GSAK is great for inputting data into the GPS and Palm! (And replaced a missing geocaching container...)

 

I WANTED to go caching, but I had just purged my GPSrs the day before, and I don't keep my old PQs, and since GC was down, I couldn't get the PQ I wanted, so I had no caches loaded up to hunt :D Gonna have to start keeping an offline database of some kind, too. Especially since I'm on vacation this week and have plans to get a LOT of caching done. Now if my PQs will just get sent....
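
For anyone in the same boat: a pocket query is just a GPX (XML) file, so an offline database can be as simple as keeping the old files around and parsing them when the site is down. A minimal Python sketch, assuming the standard GPX waypoint layout (the element names here are the usual GPX ones, but check against your own PQ files):

```python
import xml.etree.ElementTree as ET

def load_pq(path):
    """Pull cache code, description, and coordinates out of a pocket-query GPX file."""
    caches = []
    for _, elem in ET.iterparse(path):
        tag = elem.tag.rsplit('}', 1)[-1]  # ignore the XML namespace prefix
        if tag == 'wpt':
            name = desc = None
            for child in elem:
                ctag = child.tag.rsplit('}', 1)[-1]
                if ctag == 'name':
                    name = child.text   # the GC code, e.g. "GC1ABCD"
                elif ctag == 'desc':
                    desc = child.text   # cache name/type summary
            caches.append((name, desc,
                           float(elem.get('lat')), float(elem.get('lon'))))
            elem.clear()  # keep memory flat even on a 500-cache file
    return caches

# Usage: caches = load_pq('12345.gpx'); print(len(caches), caches[0])
```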

Link to comment

I am reminded of massive power outages that occur from time to time due to something comparatively minor or unanticipated. Or planes going down because birds were sucked into the engines. Incompetence in design? Not likely. Like CR said, sometimes stuff just happens.

 

Building the generator room so the sprinklers drain into it? Sounds like a design flaw.

Generators are in a different building, from what I can tell. Seems like a reasonable design to me, but I am no engineer. Agree that maybe some floor drains would be a good idea. :D

Link to comment

I wish that you had posted this before I went caching this morning, Snoogie. I thought that I was smart because I had not only grabbed my PQ on Thursday, but printed out what I needed on Friday. So the site goes down... yeah, I can't get all angsty on the forums, and that was a bummer, but at least I had what I needed for Saturday morning, right?

 

Wrong.

 

I will say that the coords in my GPS weren't bad. A little condensation on the screen did make it a bit hard to read, but that went away eventually. But I did print out a couple of cache pages... a puzzle and an EarthCache. Those printouts were ruined! The ink bled to the point where I couldn't make out a thing, and whew! did they ever stink with smoke... the kind that only comes from red-hot electrical equipment. At least I got the regular caches, but I was really counting on those other two.

Link to comment

... I don't keep my old PQs, and since GC was down, I couldn't get the PQ I wanted...

 

Gmail is awesome for that. With ~7.3 gigs of storage I won't be deleting any PQs for some time.

 

Oh, CRAP! That's where I have my PQs sent, too; it didn't even dawn on me to look there for my old PQs! Well, even then, they were still kind of old, like over a month old, but still would have been better than nothing.

Link to comment

Generators are in a different building, from what I can tell. Seems like a reasonable design to me, but I am no engineer. Agree that maybe some floor drains would be a good idea. :)

 

A different building than the sprinklers, and they still flooded? MAJOR design issues then. :unsure:

Link to comment

Fire was in a vault where Seattle City Light connected to transformers for the building. Unclear from the news reports who owned the transformers, but Seattle City Light stated the fire was started by "customer" equipment.

 

The sprinklers came on, and there was no loss of life. That's what sprinkler systems are primarily designed for: life safety. Insurance companies will often ask for other fire detection and protection in addition to building code, fire marshal, and NFPA recommendations, in an effort to minimize financial loss.

 

Sounds to me like the systems worked properly. For commercial reasons, the companies involved could look into preventing water intrusion into the backup generators; I suspect the water flowed through the conduits from the vault to wherever the generators are.

 

I had no internet access on Friday, as the fire knocked my Verizon DSL off as well.

Edited by seob
Link to comment
So, you see, even when you plan for a host of redundant systems, you can't plan for everything. Sometimes, stuff just happens.

My favourite "stuff happens" story was from a maintenance tech for Tandem in the early 80s, when Tandem were the only game in town if your app absolutely, positively had to work when hardware broke (which, in those days, was a lot more often than today).

 

He was performing maintenance on a disk drive where one pack had failed, so the system was running on the spare. This required spinning down the drive and removing the disk pack (you folks under 45 won't have any idea what I'm talking about here). As he lifted the broken disk pack out, a screwdriver fell out of his top pocket and into the drive. Now in this system, the redundant disks were located one vertically above the other, so the screwdriver landed on the running backup disk pack... :unsure:

Link to comment
So, you see, even when you plan for a host of redundant systems, you can't plan for everything. Sometimes, stuff just happens.

My favourite "stuff happens" story was from a maintenance tech for Tandem in the early 80s, when Tandem were the only game in town if your app absolutely, positively had to work when hardware broke (which, in those days, was a lot more often than today).

 

He was performing maintenance on a disk drive where one pack had failed, so the system was running on the spare. This required spinning down the drive and removing the disk pack (you folks under 45 won't have any idea what I'm talking about here). As he lifted the broken disk pack out, a screwdriver fell out of his top pocket and into the drive. Now in this system, the redundant disks were located one vertically above the other, so the screwdriver landed on the running backup disk pack... :unsure:

 

Back in the good ol' days of ceramic disks...

Link to comment

I had just signed up and hadn't received any pocket queries, and I had a group of people coming over to go geocaching, so I was scrounging around for any scraps of paper that I had written some down on before! Luckily I had 5 that I hadn't found yet. When I finally got a PQ email (still missing 5) I made sure to put that set of 500 right on my GPS so next time I'll have others to find!

 

And with everyone talking about how they couldn't do anything because "stuff happens," this is the exact reason why you have a disaster recovery (DR) site. It doesn't cost much to colocate a server at a datacenter on the other side of the country! Heck, even a VPS account would suffice: they could disallow updates while running at the DR site, but you could at least query the DB for caches!
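
To sketch what "no updates at the DR site, but queries still work" might look like, here's a hypothetical Python example. The schema, names, and read-only flag are all made up for illustration; this is obviously not Groundspeak's actual code, just the general idea:

```python
import sqlite3

DR_MODE = True  # flipped on when serving from the disaster-recovery replica

def find_caches(db_path, lat, lon, radius_deg=0.1):
    """Query caches inside a crude bounding box around (lat, lon)."""
    # Open the replica read-only so even a bug can't write to it.
    conn = sqlite3.connect(f"file:{db_path}?mode=ro", uri=True)
    try:
        return conn.execute(
            "SELECT code, name, lat, lon FROM caches"
            " WHERE lat BETWEEN ? AND ? AND lon BETWEEN ? AND ?",
            (lat - radius_deg, lat + radius_deg,
             lon - radius_deg, lon + radius_deg),
        ).fetchall()
    finally:
        conn.close()

def post_log(db_path, cache_code, log_text):
    """Writes are simply refused while running from the DR copy."""
    if DR_MODE:
        raise RuntimeError("Read-only disaster-recovery site: try again later")
    # ...normal write path on the primary site would go here...
```

Even a cheap VPS could run something like this against a nightly database dump.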

Edited by nosebreaker
Link to comment

Generators are in a different building, from what I can tell. Seems like a reasonable design to me, but I am no engineer. Agree that maybe some floor drains would be a good idea. :o

 

A different building than the sprinklers, and they still flooded? MAJOR design issues then. :unsure:

Generators and sprinklers in one building. Servers and radio/TV stations in another building. Sheesh. Quit trying to have so much fun making fun of my incomplete explanations already. :)

 

I think we both know this wasn't due to any fault on the part of Groundspeak.

 

Verizon was down in the area as well. They certainly should have been able to plan for redundancy considering their size. If they didn't think the setup was a problem, why should little Groundspeak?

Link to comment

 

 

Verizon was down in the area as well. They certainly should have been able to plan for redundancy considering their size. If they didn't think the setup was a problem, why should little Groundspeak?

 

Verizon did have some sort of contingency, because my DSL was back up by about 10 AM on Friday.

Edited by seob
Link to comment

 

 

Verizon was down in the area as well. They certainly should have been able to plan for redundancy considering their size. If they didn't think the setup was a problem, why should little Groundspeak?

 

Verizon did have some sort of contingency, because my DSL was back up by about 10 AM on Friday.

Oh sure, just pile on. :unsure: I would hope that a company with a bazillion customers would have more ways to restore service.

Link to comment

 

Oh sure, just pile on. :unsure: I would hope that a company with a bazillion customers would have more ways to restore service.

 

I wasn't piling on. That wasn't my intention. Sorry if I offended. :) I agree with you that GC.com did everything within their power and resources to get back online. They are obviously a small operation, and their level of backup seems appropriate.

 

Verizon absolutely needed to have a backup, and evidently they did. GC being down is an annoyance; my Verizon DSL service being down is a major disruption, especially for my wife!

Edited by seob
Link to comment
