
Dedicated Server For Premium Members


lowracer

Recommended Posts

We had a couple of DL580s lying around. They're cute if you put a good RAID system in them. But don't confuse them with big and bad :P

That's really funny - I was looking at those a few weeks ago and even printed out their spec sheets (they're still on my desk). :D

The ES7000s provided me with much OS-architecture entertainment in my pre-geocaching days. (Hmm. They might have been under NDA then, so scratch that.) Once you get into Real Computers, where everything, including memory and CPUs, is hot-pluggable, and you have 32 or more processors honking along in the same cabinet, things get really wild.

 

I don't know what you were running before, but in the interest of not turning this into Judge Reinhold's demo of the Dominator MX-10s, I'll say the 580s are nice boxes. The off-cache performance is disappointing on them, but that's the point of the 4MB caches that are in vogue now on Xeons. Pair them up with a strong I/O system talking to a RAID cabinet with lots of spindles and a bucketful of memory, and you get Hail Marys for lots of database sins. Ironically, that also makes those sins harder to observe and measure, because everything happens in core instead of on platters.

We had a couple of DL580s lying around. They're cute if you put a good RAID system in them. But don't confuse them with big and bad :D

That's really funny - I was looking at those a few weeks ago and even printed out their spec sheets (they're still on my desk). :D

 

Unfortunately, it'll take a few more premium members before we'll be able to get one of those.

 

:P Elias

Just make a few more forums Members Only. :D

Of course, a better option may be to have a "cache logging" page where, if you have the GC# of the cache, you can log your find on a page similar to the current logging page, but instead of requiring you to access the cache page or returning you to it after your log, you're left at a static page like the front page.

I built something like this a while ago. It saves me a ton of time in logging caches, especially when I can get my finds and notes directly out of cachemate.

 

Check out my Geocaching Express Logger here and let me know what you think.

I don't know how GSAK links to the site, but you can go directly to the log page and bypass the cache page entirely.

 

e.g.

http://www.geocaching.com/seek/log.aspx?ID=146042

Geocaching.com Add New Log=http://www.geocaching.com/seek/log.aspx?ID=%gcid

 

is what I added as a Custom URL to GSAK. I right click any cache in GSAK and select Geocaching.com Add New Log and it opens the log page. Neat and quick!
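For anyone scripting this outside GSAK, the substitution amounts to a one-line template expansion. A minimal Python sketch (the function name is mine, and note the numeric ID in the URL is the site's internal cache ID, not the GC# waypoint code):

```python
def expand_custom_url(template: str, cache_id: str) -> str:
    """Replace the %gcid token in a GSAK-style Custom URL template
    with a cache's numeric site ID."""
    return template.replace("%gcid", cache_id)

# The Custom URL template from the post above:
template = "http://www.geocaching.com/seek/log.aspx?ID=%gcid"

print(expand_custom_url(template, "146042"))
# http://www.geocaching.com/seek/log.aspx?ID=146042
```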

Excellent! This is a very handy tip for those using GSAK. Thanks.

Of course, a better option may be to have a "cache logging" page where, if you have the GC# of the cache, you can log your find on a page similar to the current logging page, but instead of requiring you to access the cache page or returning you to it after your log, you're left at a static page like the front page.

I built something like this a while ago. It saves me a ton of time in logging caches, especially when I can get my finds and notes directly out of cachemate.

 

Check out my Geocaching Express Logger here and let me know what you think.

i am sooo checking that out next time i have to log a cache.

Of course, a better option may be to have a "cache logging" page where, if you have the GC# of the cache, you can log your find on a page similar to the current logging page, but instead of requiring you to access the cache page or returning you to it after your log, you're left at a static page like the front page.

I built something like this a while ago. It saves me a ton of time in logging caches, especially when I can get my finds and notes directly out of cachemate.

 

Check out my Geocaching Express Logger here and let me know what you think.

Neat, I was wondering how you were doing the prefix thing in your recent log entries. Now I know. Does the cachemate log file contain log entries made in the field, or are you just grabbing the waypoints? (A moot point, since my Palm died and I can't justify buying another one.)

 

--Marky

Neat, I was wondering how you were doing the prefix thing in your recent log entries. Now I know. Does the cachemate log file contain log entries made in the field, or are you just grabbing the waypoints?

It grabs the waypoints, date, time, cache name and notes from cachemate.

 

For me it's usually something brief and mostly incoherent that I entered from the field so I can remember it when I write up the full log later. Occasionally it's "Marky is evil." :lol: I then edit them in the logger and post them to the site. It's nice to edit them separately from the log page because sometimes when you try to post a find and the site times out, you lose it.

We had a couple of DL580s lying around. They're cute if you put a good RAID system in them. But don't confuse them with big and bad :lol:

That's really funny - I was looking at those a few weeks ago and even printed out their spec sheets (they're still on my desk). :lol:

The ES7000s provided me with much OS-architecture entertainment in my pre-geocaching days. (Hmm. They might have been under NDA then, so scratch that.) Once you get into Real Computers, where everything, including memory and CPUs, is hot-pluggable, and you have 32 or more processors honking along in the same cabinet, things get really wild.

 

I don't know what you were running before, but in the interest of not turning this into Judge Reinhold's demo of the Dominator MX-10s, I'll say the 580s are nice boxes. The off-cache performance is disappointing on them, but that's the point of the 4MB caches that are in vogue now on Xeons. Pair them up with a strong I/O system talking to a RAID cabinet with lots of spindles and a bucketful of memory, and you get Hail Marys for lots of database sins. Ironically, that also makes those sins harder to observe and measure, because everything happens in core instead of on platters.

We decided not to buy unpopulated quad-capable machines anymore. For our last SQL Server upgrade we considered buying them dual-proc to start with and upgrading later. But by the time you get around to populating them, say a year out, the premium you paid for the quad-capable chassis would have bought next year's fastest machine outright, and that's in today's money. The price gouging on old processors and upgrades is ludicrous. And with the data on the SAN, forklifting the data files to the new cluster was simple.

 

Interestingly, I saw an article on this a little while back (http://www.infoworld.com/article/04/08/13/33FEmyth1_1.html?s=feature).

 

So we pretty much stuck to only dual-proc for all servers and retired the quads.

We paid.  We shouldn't have to wait for clogged servers.  Give us our own dedicated machine, with a guaranteed quality of service.  This would easily be worth $50-75 a year. 

Interesting idea. But as others have said, with the large number of premium members, response time could actually be worse.

 

But there's a simple way to handle it. The dedicated server should only be for Charter members. After all, that's a limited number, and it will never get bigger. I think this is a fantastic idea! :lol:

 

After all, we're the ones who put our money in it back when it was in its infancy, and we're the ones who've stuck with it for the past 3+ years.

Edited by Prime Suspect
For those of us who don't use cachemate, could you post a format description so we could write our own file and use your app? The system you have looks incredibly cool, and it fills a big need!

The format is pretty weird as it's what Palm Desktop does for text exports.

 

I suppose I could also accept GPX files with Groundspeak extensions in the way that they come out of pocket queries, such as this:

 

<?xml version="1.0" encoding="utf-8"?>
<gpx xmlns:xsd="http://www.w3.org/2001/XMLSchema" xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance" version="1.0" creator="Groundspeak Pocket Query" xsi:schemaLocation="http://www.topografix.com/GPX/1/0 http://www.topografix.com/GPX/1/0/gpx.xsd http://www.Groundspeak.com/cache/1/0 http://www.Groundspeak.com/cache/1/0/cache.xsd" xmlns="http://www.topografix.com/GPX/1/0">
  <wpt lat="37.409883" lon="-122.070467">
    <name>GC6B7D</name>
    <Groundspeak:cache id="27517" available="True" archived="False" xmlns:Groundspeak="http://www.Groundspeak.com/cache/1/0">
      <Groundspeak:logs>
        <Groundspeak:log id="4839845">
          <Groundspeak:date>2004-08-30T07:00:00</Groundspeak:date>
          <Groundspeak:type>Write note</Groundspeak:type>
          <Groundspeak:finder id="209252">boulter</Groundspeak:finder>
          <Groundspeak:text encoded="False">Marky is still evil.</Groundspeak:text>
        </Groundspeak:log>
      </Groundspeak:logs>
    </Groundspeak:cache>
  </wpt>
</gpx>

 

Then again, I know of no tool that will export this format with user-entered logs. GSAK?

 

But I'm open to other formats as well if there is a standard way to get data from apps into an exportable log format. CacheMate is the only tool that I know of that is designed to capture logs (and that's what I use), so I focused on that initially.
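For what it's worth, pulling user logs out of a pocket-query GPX file like the sample above is straightforward with any namespace-aware XML parser. A minimal Python sketch using the standard library; the function name and the trimmed sample document are mine, and the namespace URIs are assumed from the sample:

```python
import xml.etree.ElementTree as ET

# Namespace URIs as they appear in a Groundspeak pocket-query GPX.
GPX = "http://www.topografix.com/GPX/1/0"
GS = "http://www.Groundspeak.com/cache/1/0"

def extract_logs(gpx_text: str):
    """Return (waypoint, date, type, finder, text) tuples for every
    Groundspeak:log in the GPX document."""
    root = ET.fromstring(gpx_text)
    results = []
    for wpt in root.findall(f"{{{GPX}}}wpt"):
        name = wpt.findtext(f"{{{GPX}}}name")
        for log in wpt.iter(f"{{{GS}}}log"):
            results.append((
                name,
                log.findtext(f"{{{GS}}}date"),
                log.findtext(f"{{{GS}}}type"),
                log.findtext(f"{{{GS}}}finder"),
                log.findtext(f"{{{GS}}}text"),
            ))
    return results

# Trimmed version of the sample GPX posted above.
SAMPLE = """<?xml version="1.0" encoding="utf-8"?>
<gpx xmlns="http://www.topografix.com/GPX/1/0" version="1.0" creator="Groundspeak Pocket Query">
 <wpt lat="37.409883" lon="-122.070467">
  <name>GC6B7D</name>
  <Groundspeak:cache id="27517" xmlns:Groundspeak="http://www.Groundspeak.com/cache/1/0">
   <Groundspeak:logs>
    <Groundspeak:log id="4839845">
     <Groundspeak:date>2004-08-30T07:00:00</Groundspeak:date>
     <Groundspeak:type>Write note</Groundspeak:type>
     <Groundspeak:finder id="209252">boulter</Groundspeak:finder>
     <Groundspeak:text encoded="False">Marky is still evil.</Groundspeak:text>
    </Groundspeak:log>
   </Groundspeak:logs>
  </Groundspeak:cache>
 </wpt>
</gpx>"""

print(extract_logs(SAMPLE))
# [('GC6B7D', '2004-08-30T07:00:00', 'Write note', 'boulter', 'Marky is still evil.')]
```

The same approach would read a GPX that a logging app exports with user-entered logs, if any tool ever produces one.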

I suppose I could also accept GPX files with Groundspeak extensions in the way that they come out of pocket queries,

Groundspeak GPX is the "obviously correct" answer to that question. I'm still trying to decide if a program that generates it runs afoul of this (and related) joy in http://www.geocaching.com/waypoints/agreement.aspx, but that's actually true of a couple of dozen programs (including my own) that do Groundspeak: extensions to the open GPX specification.

 

· Licensee shall not reverse engineer, decompile, or disassemble the Groundspeak-compatible data format(s) in an attempt to duplicate the proprietary and copyright-protected Groundspeak data model(s) and/or export format(s).

 

Even with that aside, this is a neat front-end. Combine that with a bit of scripting for the Garmins with "geocaching mode" to produce a list of GC#'s in the unit with associated icons of an open treasure chest and you're getting pretty warm.

 

But if you really wanted to make this nifty, bribe ClydeE and Boulter to make the "show me this cache page" link go to the locally rendered data from your PQ to eliminate the page loads, yet still provide you with the text of the cache page to jog your memory so you can make a coherent log.

This topic is now closed to further replies.