
Another Question About The Slow Website


Cheminer Will


I know this gets talked about a lot here, but I have never really understood the reasons, and if I did, I would probably not get so frustrated. I understand there are times when the site has lots of visitors. But I visit many sites that I think have more traffic than geocaching.com and never have problems like the "Server too busy" errors or painfully slow-loading pages. Also, this has been a periodic problem at geocaching.com for a long time.

 

Like many users of this site, I have no knowledge about website construction and databases. I think we would be less frustrated if we knew exactly why this problem can't be fixed. Just to be told that too many people are on the site and to use the site on another day is exasperating. Is it too expensive to fix? Is it a technology problem? Is it the nature of the information being accessed on the site?


Yo, this is something I can't dig!!! I am new to the geocaching world and I really dig it. Dug it so much I got a premium membership, but the web site needs to get fixed. Request timeouts and SQL errors don't take much to fix: either the bandwidth connection is too slow or it is poorly written SQL. Please fix!


It usually starts on Friday and goes to Sunday evening, and it happens every year after the holidays as new cachers find out about geocaching. A lot of times I'll wait until later in the week to log my finds; it makes things a lot less frustrating that way.

 

Although this is certainly true, it's still frustrating, even for long-time cachers. As a teacher, my evenings/weeknights are often taken up with meetings, grading papers, lesson plans, directing play practices, etc., and allow little time for caching/logging finds. I truly appreciate gc.com, but surely the weekend situation could be improved. Shouldn't the workaround be provided by the service instead of those who pay for membership?


It usually starts on Friday and goes to Sunday evening, and it happens every year after the holidays as new cachers find out about geocaching. A lot of times I'll wait until later in the week to log my finds; it makes things a lot less frustrating that way.

 

Although this is certainly true, it's still frustrating, even for long-time cachers. As a teacher, my evenings/weeknights are often taken up with meetings, grading papers, lesson plans, directing play practices, etc., and allow little time for caching/logging finds. I truly appreciate gc.com, but surely the weekend situation could be improved. Shouldn't the workaround be provided by the service instead of those who pay for membership?

 

This was sort of the point of my first post. I am not so much interested in fostering continuing complaints as I am in someone in the know explaining why this cannot be fixed. Other websites have as much or more traffic than geocaching.com and don't have the slowness. I know I can wait until Monday to do my business on geocaching.com; I just want to know why, after a year or more of this, I still have to.

Edited by Cheminer Will

This was sort of the point of my first post. I am not so much interested in fostering continuing complaints as I am in someone in the know explaining why this cannot be fixed. Other websites have as much or more traffic than geocaching.com and don't have the slowness. I know I can wait until Monday to do my business on geocaching.com; I just want to know why, after a year or more of this, I still have to.

 

I've indicated many times in this forum section why the slowness is so great on weekends on Geocaching.com. At this point in time there is a log entry per second being entered into the database. I'm sure that if the site didn't have the slowdowns you could expect them to roll in even faster.

 

I doubt there are many other web sites that see this kind of traffic, at least the unique situation we have when everyone logs their finds at the same time. Usually sites get some increased levels during different timeframes, but I doubt they get the kind of dogpiling this site gets.

 

We do essentially beat the problem year to year, but we have also been doubling traffic from year to year. So we slip behind in the early months and, after throwing more hardware and offloading tricks at it, get set up for the rest of the year's traffic. The problem is that at the end of the day we end up having far more hardware and speed than we really need if you balanced it all out, not to mention how expensive it is to build up hardware to handle the busy hours when people are queuing up to enter logs.

 

We've thrown a lot of SQL prowess into tuning how the site runs, but we do have to step it up a notch. We'll do the best we can with the limited financial resources we have (thanks, by the way, to all the premium members who make this site run at all), but please be patient as we continue to add more hamsters and hamster feed.


Thanks, this is the sort of information I was looking for.

 

At this point in time there is a log entry per second being entered into the database.

 

And this is really something else! One per second, wow. How long does this level go on for? It would be fun to see exactly how many logs are entered on a Sunday from, say, noon to midnight.
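(For scale: if that one-log-a-second rate held steady, noon to midnight would be 12 hours × 3,600 seconds, or about 43,200 logs.)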

Server Application Unavailable

 

The web application you are attempting to access on this web server is currently unavailable. Please hit the "Refresh" button in your web browser to retry your request.

 

Administrator Note: An error message detailing the cause of this specific request failure can be found in the application event log of the web server. Please review this log entry to discover what caused this error to occur.

 

Did it fail altogether? I've been terribly disappointed with Microsoft software through the years and have personally preferred Linux and other higher-quality stuff that wasn't written as if it were for toy applications. I always wonder if that isn't the main problem at gc.com.

 

Thanks for the site and the work supporting and creating the geocaching activity, Jeremy.

 

Nudecacher


...We've thrown a lot of SQL prowess into tuning how the site runs, but we do have to step it up a notch. We'll do the best we can with the limited financial resources we have (thanks, by the way, to all the premium members who make this site run at all), but please be patient as we continue to add more hamsters and hamster feed.

 

Thanks for the explanation again, Jeremy, for those of us with short memories. At the risk of offending a few fellow forumites, I know I'd be willing to pay more for the premium membership if it would help provide more hamster feed. Managed to log a few finds from the trails today, so I'm still a happy cacher, even with a Sunday night server.


One log per second would be a lot if a secretary were filing things, but not a server.

Remember that a typical log is less than 1K in size. It is nearly insignificantly small.

Even a 10-year-old server can deal with thousands of logs of that size per second.

So one log per second is not the cause of the problem; it is the result of the problem.


One log per second would be a lot if a secretary were filing things, but not a server.

Remember that a typical log is less than 1K in size. It is nearly insignificantly small.

Even a 10-year-old server can deal with thousands of logs of that size per second.

So one log per second is not the cause of the problem; it is the result of the problem.

When my secretary puts a document in a file folder, it's one document and one folder. GC.com is a complex relational database. I was thinking about this when I wrote a "found it" log and dropped off three geocoins that I own into a cache that's on my watchlist. That "one log" has a ripple effect:

 

It updated the cache page to show my log and the trackables.

It updated my personal stats... so when I look at "My Account," my public profile, a search of my finds, etc., that log shows up.

It updated each of the three trackables' history pages, and the trackable history page for the cache.

It updated the Pennsylvania state page to reflect these three as the most recent drops.

It updated the master travel bug page to reflect these three as the most recent drops.

It sent out e-mails to everyone on the watchlist or a bookmark list for the cache page, or the watchlist for any of the trackables, as well as to the owners of the cache page and each trackable.

 

Find me a secretary who can do all that in a second.
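To make the ripple concrete, here is a minimal ADO.NET sketch of what the database-side writes for one "Found It" log with trackable drops could look like if done as a single transaction, with the e-mail fan-out pushed onto a queue. Every table and column name below is invented for illustration (Groundspeak's real schema isn't public), and the stats/state-page "updates" in the list above would then fall out at page-generation time rather than being separate writes.

```csharp
using System;
using System.Data.SqlClient;

// Hypothetical sketch only; these table and column names are invented.
class LogRippleSketch
{
    // One "Found It" log plus trackable drops, written as a single transaction.
    static void LogFindWithDrops(string connStr, int cacheId, int userId, int[] trackableIds)
    {
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            conn.Open();
            using (SqlTransaction tx = conn.BeginTransaction())
            {
                // The log itself: one small row, well under 1K.
                Exec(conn, tx,
                    "INSERT INTO Logs (CacheId, UserId, LogType, LogDate) " +
                    "VALUES (@cache, @user, 'Found It', GETDATE())",
                    new SqlParameter("@cache", cacheId),
                    new SqlParameter("@user", userId));

                // Each dropped trackable adds a movement-history row and
                // changes the trackable's current location.
                foreach (int tbId in trackableIds)
                {
                    Exec(conn, tx,
                        "INSERT INTO TrackableMoves (TrackableId, ToCacheId, MovedBy) " +
                        "VALUES (@tb, @cache, @user); " +
                        "UPDATE Trackables SET CurrentCacheId = @cache WHERE TrackableId = @tb",
                        new SqlParameter("@tb", tbId),
                        new SqlParameter("@cache", cacheId),
                        new SqlParameter("@user", userId));
                }

                // Watchlist/owner e-mails: queue one row so a background job
                // can do the expensive fan-out outside the transaction.
                Exec(conn, tx,
                    "INSERT INTO NotificationQueue (CacheId, LoggedBy) VALUES (@cache, @user)",
                    new SqlParameter("@cache", cacheId),
                    new SqlParameter("@user", userId));

                tx.Commit();
            }
        }
    }

    static void Exec(SqlConnection conn, SqlTransaction tx, string sql, params SqlParameter[] ps)
    {
        using (SqlCommand cmd = new SqlCommand(sql, conn, tx))
        {
            cmd.Parameters.AddRange(ps);
            cmd.ExecuteNonQuery();
        }
    }
}
```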

It updated the cache page to show my log and the trackables.

It updated my personal stats... so when I look at "My Account," my public profile, a search of my finds, etc., that log shows up.

It updated each of the three trackables' history pages, and the trackable history page for the cache.

It updated the Pennsylvania state page to reflect these three as the most recent drops.

It updated the master travel bug page to reflect these three as the most recent drops.

It sent out e-mails to everyone on the watchlist or a bookmark list for the cache page, or the watchlist for any of the trackables, as well as to the owners of the cache page and each trackable.

 

Please tell me this is not an accurate synopsis of what really happens.

At the risk of offending a few fellow forumites, I know I'd be willing to pay more for the premium membership if it would help provide more hamster feed.

 

No offense taken, but I'm not willing to throw more money at the problem. There have been quite a few suggestions from folks a lot smarter than I (folks who, I'm thinking, work in the industry) for programming and usability changes alone that would speed up the site. Throwing money at Groundspeak is like throwing money at the education system: until there is a fundamental change in how it is operated, the only thing you get is more infrastructure, and at the end of the day you're right where you started.

It updated the cache page to show my log and the trackables.

It updated my personal stats... so when I look at "My Account," my public profile, a search of my finds, etc., that log shows up.

It updated each of the three trackables' history pages, and the trackable history page for the cache.

It updated the Pennsylvania state page to reflect these three as the most recent drops.

It updated the master travel bug page to reflect these three as the most recent drops.

It sent out e-mails to everyone on the watchlist or a bookmark list for the cache page, or the watchlist for any of the trackables, as well as to the owners of the cache page and each trackable.

 

Please tell me this is not an accurate synopsis of what really happens.

:D No, there's a healthy bit of hyperbole in there, of course. A record is added to a database. When someone calls up any of the other pages described above, the page is generated and must reflect that new record. And a *lot* of pages get generated.

 

Mainly I just need a new secretary. Interviewing one today and one tomorrow.
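One standard lever for exactly this "lots of pages get generated" situation, offered purely as an illustration of the trade-off (the thread doesn't say what gc.com actually does), is ASP.NET output caching: keep the rendered page around briefly so a popular page is rebuilt once a minute instead of once per viewer, at the cost of a new log taking up to that long to appear on it.

```csharp
using System;
using System.Web;
using System.Web.UI;

// Illustrative only: programmatic ASP.NET output caching for a busy page.
public class CacheDetailsPage : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        Response.Cache.SetCacheability(HttpCacheability.Server); // keep copies on the server
        Response.Cache.SetExpires(DateTime.Now.AddSeconds(60));  // rebuild at most once a minute
        Response.Cache.SetValidUntilExpires(true);
        Response.Cache.VaryByParams["guid"] = true;              // one cached copy per cache listing
    }
}
```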


It usually starts on Friday and goes to Sunday evening, and it happens every year after the holidays as new cachers find out about geocaching. A lot of times I'll wait until later in the week to log my finds; it makes things a lot less frustrating that way.

 

Do you know when in the year it starts to "calm down"? I started last August and only noticed this problem after the holidays, so what you say makes sense. Obviously it's sometime before August but I'm just curious when, thanks.


A record is added to a database. When someone calls up any of the other pages described above, the page is generated and must reflect that new record. And a *lot* of pages get generated.

 

True... and in some cases (when it's the HTML-generating bit that's being slow), I find that I can log quickly with wap.geocaching.com (it works from a regular Web browser, not just a phone) and then come back later and edit the log for more detail. Good for getting travel bug logs in right away when you know the TB has probably already moved on, and/or for getting the first log in so your FTF claim looks more plausible. :D (OTOH, the watchers only get a rather cryptic e-mail: "WAP log, more later".)

 

However, when it's the database server itself being slow, the WAP logs have to wait their turn like everyone else. I've also seen WAP submissions say "the submission was unsuccessful", then retry it, only to find that the first one was successful and I've logged two finds.

 

I wonder if some kind of "light" display mode, without icons/frames (but what about the adverts that pay for everything?), might help. And/or maybe deferred, asynchronous processing of the watchlist/owner e-mails?
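That deferred-processing idea is easy to picture. Here is a minimal, self-contained sketch (invented names, not Groundspeak code) of the shape it could take: the log-submission path only enqueues a notification and returns, while a background worker does the slow mail-sending.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

// Sketch of deferred watchlist e-mail processing; names are invented.
class NotificationMailer
{
    private readonly Queue<int> pendingLogIds = new Queue<int>();
    private readonly object gate = new object();

    public NotificationMailer()
    {
        Thread worker = new Thread(DrainQueue);
        worker.IsBackground = true;
        worker.Start();
    }

    // Called from the log-submission page: cheap, returns immediately.
    public void Enqueue(int logId)
    {
        lock (gate)
        {
            pendingLogIds.Enqueue(logId);
            Monitor.Pulse(gate);
        }
    }

    private void DrainQueue()
    {
        while (true)
        {
            int logId;
            lock (gate)
            {
                while (pendingLogIds.Count == 0) Monitor.Wait(gate);
                logId = pendingLogIds.Dequeue();
            }
            // The slow part happens off the request thread: look up watchers,
            // build the message, talk to the mail server.
            SendWatchlistMail(logId);
        }
    }

    private void SendWatchlistMail(int logId)
    {
        Console.WriteLine("mailing watchers for log {0}", logId); // stand-in for SMTP
    }

    static void Main()
    {
        NotificationMailer mailer = new NotificationMailer();
        mailer.Enqueue(42);  // the "page" returns immediately...
        Thread.Sleep(1000);  // ...while the worker sends in the background
    }
}
```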


Throwing money at Groundspeak is like throwing money at the education system: until there is a fundamental change in how it is operated, the only thing you get is more infrastructure, and at the end of the day you're right where you started.

Perhaps Groundspeak can offer you a voucher and you can use it to start your own Geocaching site? :D


Throwing money at Groundspeak is like throwing money at the education system: until there is a fundamental change in how it is operated, the only thing you get is more infrastructure, and at the end of the day you're right where you started.

Perhaps Groundspeak can offer you a voucher and you can use it to start your own Geocaching site? :D

 

:D Oh, how original. I never heard that one before. :D Good one.


I would like to see a new (premium) feature for offline logs: an extended version of GSAK could be used to enter all the logs of a happy cache day and then upload them as a single GPX file to a GC.com queue, where they get batch-processed as soon as the server load allows.

 

I don't care if my logs show up only after hours (or days) in the batch queue, but I do care about getting my logging work done right after coming home. We have Pocket Queries; why not add Pocket Logs?


I would like to see a new (premium) feature for offline logs: an extended version of GSAK could be used to enter all the logs of a happy cache day and then upload them as a single GPX file to a GC.com queue, where they get batch-processed as soon as the server load allows.

 

I don't care if my logs show up only after hours (or days) in the batch queue, but I do care about getting my logging work done right after coming home. We have Pocket Queries; why not add Pocket Logs?

 

I've considered making the same request when this issue has come up before. Great idea :D

Edited by tozainamboku
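The client half of the Pocket Logs proposal quoted above could be as simple as writing the day's logs into one XML file for a single upload. A rough sketch follows; the element names are invented, since no such upload format existed at the time.

```csharp
using System;
using System.Xml;

// Sketch of a client-side "Pocket Logs" file; the format is made up.
class PocketLogWriter
{
    static void Main()
    {
        XmlWriterSettings settings = new XmlWriterSettings();
        settings.Indent = true;
        using (XmlWriter w = XmlWriter.Create("pocketlogs.xml", settings))
        {
            w.WriteStartElement("pocketlogs");
            WriteLog(w, "GC12345", "Found It", DateTime.Now, "Nice hide, TFTC!");
            WriteLog(w, "GC67890", "Didn't find it", DateTime.Now, "Muggles everywhere.");
            w.WriteEndElement();
        }
        // The whole file would then be uploaded once and queued server-side,
        // like a Pocket Query in reverse.
    }

    static void WriteLog(XmlWriter w, string gcCode, string type, DateTime when, string text)
    {
        w.WriteStartElement("log");
        w.WriteElementString("cache", gcCode);
        w.WriteElementString("type", type);
        w.WriteElementString("date", when.ToString("s")); // sortable ISO timestamp
        w.WriteElementString("text", text);
        w.WriteEndElement();
    }
}
```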

There have been quite a few suggestions from folks a lot smarter than I (folks who, I'm thinking, work in the industry) for programming and usability changes alone that would speed up the site.

 

You keep referencing these, but I don't recall any that were particularly helpful or that we hadn't already considered and dismissed. And no, "switch to Linux" is neither a valid nor a useful option.


Throwing money at Groundspeak is like throwing money at the education system: until there is a fundamental change in how it is operated, the only thing you get is more infrastructure, and at the end of the day you're right where you started.

Perhaps Groundspeak can offer you a voucher and you can use it to start your own Geocaching site? :D

 

:D Oh, how original. I never heard that one before. :D Good one.

And yet, with all those folks saying it to you, you never listen. :D


I've considered making the same request when this issue has come up before. Great idea :D

 

We're building an API to do this.

 

VERY nice! Since it's an API, I'm assuming that GSAK and other apps would be able to tie into it as well (if the TOS doesn't get in the way). How sweet would that be?

Edited by Semper Questio

That's right. The model is: if you have a premium membership, you can use the API features through other software applications. It may have some throttling on the service (like we do with Google Earth), but it should be sufficient for your caching needs.
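Since the API itself didn't exist yet at this point in the thread, here is a purely hypothetical sketch of the client-side half of that throttling model: the kind of helper a tool like GSAK might use to stay under a per-minute call cap. The cap and all names are assumptions.

```csharp
using System;
using System.Threading;

// Hypothetical client-side rate limiter: at most N calls per one-minute window.
class ApiThrottle
{
    private readonly int maxCallsPerMinute;
    private int callsThisMinute;
    private DateTime windowStart = DateTime.UtcNow;
    private readonly object gate = new object();

    public ApiThrottle(int maxCallsPerMinute)
    {
        this.maxCallsPerMinute = maxCallsPerMinute;
    }

    // Blocks until a call is allowed under the cap. Call before each API request.
    // Note: sleeping while holding the lock also stalls other callers, which is
    // exactly what a shared per-client throttle should do.
    public void WaitForSlot()
    {
        lock (gate)
        {
            if ((DateTime.UtcNow - windowStart).TotalSeconds >= 60)
            {
                windowStart = DateTime.UtcNow; // a fresh one-minute window
                callsThisMinute = 0;
            }
            if (callsThisMinute >= maxCallsPerMinute)
            {
                double wait = 60 - (DateTime.UtcNow - windowStart).TotalSeconds;
                if (wait > 0) Thread.Sleep(TimeSpan.FromSeconds(wait));
                windowStart = DateTime.UtcNow;
                callsThisMinute = 0;
            }
            callsThisMinute++;
        }
    }
}
```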


There have been quite a few suggestions from folks a lot smarter than I (folks who, I'm thinking, work in the industry) for programming and usability changes alone that would speed up the site.

 

You keep referencing these, but I don't recall any that were particularly helpful or that we hadn't already considered and dismissed. And no, "switch to Linux" is neither a valid nor a useful option.

 

I only bring these up because it appears others have gotten the message you're going to do it your way and they got tired of wasting their breath. I guess I should do the same, but I don't particularly care to sit by and let folks think it is merely a hardware issue.

 

To tell you the truth, I'd like to eat crow when I say I doubt any API or significant software solution is less than 18 months away. Heck, I'm still waiting for the Rock-n-Roll geocaching.com II, or whatever you called it, which is, what, 2 years overdue.


Throwing money at Groundspeak is like throwing money at the education system: until there is a fundamental change in how it is operated, the only thing you get is more infrastructure, and at the end of the day you're right where you started.

Perhaps Groundspeak can offer you a voucher and you can use it to start your own Geocaching site? :rolleyes:

 

:P Oh, how original. I never heard that one before. :P Good one.

And yet, with all those folks saying it to you, you never listen. :P

 

What, and leave my favorite grumpy reviewer behind? No thanks. I'll stay here and keep putting my two cents where I think they need to be spent. :D


That's right. The model is if you have a premium membership you can use the API features through other software applications. It may have some throttling on the service (like we do with Google Earth) but it should be sufficient for your caching needs.

 

SWEET! It'll be ready tomorrow, right!? :P:rolleyes:

 

This will be a great feature! Logging is the main time I get frustrated with the site. Most of my other time-sensitive cache information chores get done by GSAK whenever I need it.

 

Now come the "timetable for this feature" questions............

Semper Questio is usually pretty accurate in posts. Tomorrow sounds good to me! :P

Edited by Cheminer Will

And yet, with all those folks saying it to you, you never listen. :P

What, and leave my favorite grumpy reviewer behind? No thanks. I'll stay here and keep putting my two cents where I think they need to be spent. :P

Keystone will be happy to hear that. :rolleyes::P

Darn, I had $3.00 in my pocket ready to go in the mail to buy a premium membership on your site!

 

I usually don't review caches past midday on Sunday to lighten the load. We draw a lot of resources from the site when we review caches (looking at nearby caches, looking at GC mapping, etc.).


 

I only bring these up because it appears others have gotten the message you're going to do it your way and they got tired of wasting their breath. I guess I should do the same, but I don't particularly care to sit by and let folks think it is merely a hardware issue.

 

 

I don't think I indicated it was merely a hardware issue. I mentioned "offloading tricks" but didn't dwell on the many other programming tricks we do to make the site run faster. You don't have any suggestions because you're not a programmer, so I don't expect you to understand how to run a high-traffic web site. Nor do I expect most programmers to understand it either.

 

No offense, but you tend to offer your opinions as though you have some kind of knowledge and authority about how things work. I'd suggest you give up, since you don't understand the subject matter.

 

To tell you the truth, I'd like to eat crow when I say I doubt any API or significant software solution is less than 18 months away. Heck, I'm still waiting for the Rock-n-Roll geocaching.com II, or whatever you called it, which is, what, 2 years overdue.

 

If this site kicked out gold bars you'd complain they weren't 24 karat.


 

I only bring these up because it appears others have gotten the message you're going to do it your way and they got tired of wasting their breath. I guess I should do the same, but I don't particularly care to sit by and let folks think it is merely a hardware issue.

 

 

I don't think I indicated it was merely a hardware issue. I mentioned "offloading tricks" but didn't dwell on the many other programming tricks we do to make the site run faster. You don't have any suggestions because you're not a programmer, so I don't expect you to understand how to run a high-traffic web site. Nor do I expect most programmers to understand it either.

 

No offense, but you tend to offer your opinions as though you have some kind of knowledge and authority about how things work. I'd suggest you give up, since you don't understand the subject matter.

 

To tell you the truth, I'd like to eat crow when I say I doubt any API or significant software solution is less than 18 months away. Heck, I'm still waiting for the Rock-n-Roll geocaching.com II, or whatever you called it, which is, what, 2 years overdue.

 

If this site kicked out gold bars you'd complain they weren't 24 karat.

:rolleyes:


You don't have any suggestions because you're not a programmer, so I don't expect you to understand how to run a high-traffic web site. Nor do I expect most programmers to understand it either.

 

Well, I am a programmer and I do develop high-traffic web sites, which really doesn't mean squat :rolleyes:. I would be happy to assist in any way that you and Raine could use the help with improving the performance of the web site. I know you want developers in your offices, but you could send code out to developers for peer review so we could make suggestions based upon how we develop our web sites.


The major issue we have at the moment is SQL. We need an MSSQL developer who understands the inner workings of SQL Server, right down to the low-level, behind-the-scenes work it does for each query. I'm sure we can knock some of the inefficiencies out of the queries right there, but it will take a specialized person to dive into it.


The major issue we have at the moment is SQL. We need an MSSQL developer who understands the inner workings of SQL Server, right down to the low-level, behind-the-scenes work it does for each query. I'm sure we can knock some of the inefficiencies out of the queries right there, but it will take a specialized person to dive into it.

You want Kimberly Tripp; she specializes in high availability, from both a DR and a performance perspective. If you've gone through all her online materials and haven't made an impact on your code, I'd be surprised. Have you looked at your OS performance monitors, like the disk queue length, too? And at indexing, using the profiler? Just looking at the execution plans of your top queries can usually spot the most egregious bottlenecks. You can get a lot of her papers and seminars online, and she has also appeared on DNR podcasts.
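For readers who want to try the statistics advice at home, here is one concrete way to surface SQL Server's per-query I/O numbers from ADO.NET: SET STATISTICS IO/TIME output comes back as informational messages, which SqlConnection raises through its InfoMessage event. The connection string and the Logs table below are placeholders; high logical-read counts in the output are the usual sign of a missing or unhelpful index.

```csharp
using System;
using System.Data.SqlClient;

// Run a suspect query and print SQL Server's I/O statistics messages.
class QueryIoStats
{
    static void Main()
    {
        // Placeholder connection string and query; substitute your own.
        string connStr = "Data Source=.;Initial Catalog=Geocaching;Integrated Security=true";
        using (SqlConnection conn = new SqlConnection(connStr))
        {
            // STATISTICS output arrives via the InfoMessage event.
            conn.InfoMessage += delegate(object sender, SqlInfoMessageEventArgs e)
            {
                Console.WriteLine(e.Message); // e.g. "Table 'Logs'. Scan count 1, logical reads ..."
            };
            conn.Open();

            SqlCommand cmd = new SqlCommand(
                "SET STATISTICS IO ON; SET STATISTICS TIME ON; " +
                "SELECT TOP 10 * FROM Logs ORDER BY LogDate DESC;", conn);
            using (SqlDataReader r = cmd.ExecuteReader())
            {
                while (r.Read()) { /* drain the results so all messages arrive */ }
            }
        }
    }
}
```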


If this site kicked out gold bars you'd complain they weren't 24 karat.

 

Nah, a more accurate analogy would be you saying the site would be kicking out gold bars while your users have to wait 2 years before they see them.

 

Something else: I've not complained in the above thread. I started by pointing out that I'm not willing to fork out more money for more hardware, which is a common offer by those who don't know all of the issues. Not that I'm an expert, far from it. I do know not all problems can be solved via software. It appears as though the only thing you are looking at is either adding hardware or making the present system more efficient. Obviously neither is working very well, because we're right back in the cycle of slow servers again.

 

Yes, I understand there are more users, but I would have thought three or four of these cycles would have taught someone that what they are doing is not working.

 

Using a queue to serve PQs was brilliant and exhibited foresight. Why hasn't something similar for the logs been implemented before now?

 

Why force the shutdown of previous "speed loggers"? Why force someone to call up one of your pages in order to log?

 

Why not program an "expert mode" and allow in-the-know users to skip steps in logging? Wouldn't it be trivial to program a "Log Me" button next to a cache listing on the Nearest Caches page? Wouldn't that skip a page that loads not just the cache page but hits the TB tables, logs, and all sorts of other data not needed to log a cache?

 

Those are just a few suggestions that would remove some of the load from the servers. I'm positive there are other suggestions that were given that I'm not recalling at the moment.

 

No, I fully admit I'm not an expert nor do I run a high volume site, but I do see usability issues that would certainly help.

 

Somebody mentioned to me a while back that it almost seems as though these issues are intentional, in order to garner the ever-present "I'll be happy to pay more money to fix the problems" when the problem isn't really money.


I'll pay CR's extra cash and mine. I think for what this site gives most of us, and the happiness it gives most of us that we may or may not have previously had, we may be underpaying for the bandwidth that we blow. Personally, I would have no problem giving double or more of what the premium membership is now; $5 or $10 a month would seem fair to me. Heck, my girlfriend's account is bumped to premium because of me, and she has no need for the perks of premium membership. I did it just to give to the cause. Just my 2¢ from 5¢.


The major issue we have at the moment is SQL

 

As an outsider looking in, here are some things that pop into mind for you guys to consider (you may be doing some of these things already); a few of them are sketched in code after this list.

 

1 - Store the connection string in web.config and use it for every connection. Even the slightest difference in the connection-string text causes a new connection pool to be created instead of reusing a pooled connection.

 

2 - Use DataReaders where possible instead of DataSets.

 

3 - Use the SQLHelper class library for data access. This allows you to get a data reader with a single line of code instead of creating Connection and Command objects.

 

4 - If you're using Connection and Command objects, are you opening as late as possible in the code and closing as soon as possible? Do you include the close-connection option when executing data readers?

 

5 - All database access must go through stored procedures.

 

6 - Use transactions. I know they are more expensive in terms of processing but it would eliminate all of the multiple logs to drop travel bugs and multiple logs to retrieve them.

 

7 - Migrate to ASP.NET 2.0 and use asynchronous pages. (This is something I'm looking into now for our projects.)

 

I'd be happy to help in any way I can. I'll sign an NDA if you want.
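To make items 1, 2, 4 and 5 concrete, here is a small sketch of the suggested data-access pattern. The connection-string key and stored procedure name are invented for illustration; only the pattern itself is the point.

```csharp
using System;
using System.Configuration;
using System.Data;
using System.Data.SqlClient;

// Sketch of the data-access suggestions above; names are hypothetical.
class DataAccessSketch
{
    // Item 1: one connection string, read from web.config and reused everywhere,
    // so every request draws from the same connection pool.
    static readonly string ConnStr =
        ConfigurationManager.ConnectionStrings["gc"].ConnectionString;

    static void PrintRecentLogs(int cacheId)
    {
        // Item 4: open as late as possible; 'using' closes as soon as possible.
        using (SqlConnection conn = new SqlConnection(ConnStr))
        using (SqlCommand cmd = new SqlCommand("dbo.GetRecentLogs", conn))
        {
            cmd.CommandType = CommandType.StoredProcedure; // item 5
            cmd.Parameters.AddWithValue("@CacheId", cacheId);
            conn.Open();
            // Item 2: a forward-only DataReader instead of a buffered DataSet;
            // CloseConnection releases the connection when the reader is done.
            using (SqlDataReader r = cmd.ExecuteReader(CommandBehavior.CloseConnection))
            {
                while (r.Read())
                    Console.WriteLine("{0}: {1}", r["Finder"], r["LogType"]);
            }
        }
    }
}
```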


1 - Store the connection string in web.config and use it for every connection. Even the slightest difference in the connection-string text causes a new connection pool to be created instead of reusing a pooled connection.

deadhorse.gif

 

That same tired old "store the connection string in web.config" argument that we hear over and over and over. Just once I'd like to see a discussion of site performance without seeing Mr. web.config rearing its ugly head again. Back in 2002 when I was a n00b, it was fun to watch the flames fly and eat popcorn. I was innocent, and easily entertained. Eventually, I joined the "pro-web.config" crowd, took a public stance, and found out just how dark an abyss I had sunk into. The hate mail started shortly after that first post I made, and life has never been quite the same since then.

 

web.config has sucked the fun out of geocaching.


The major issue we have at the moment is SQL

 

...

 

6 - Use transactions. I know they are more expensive in terms of processing but it would eliminate all of the multiple logs to drop travel bugs and multiple logs to retrieve them.

 

7 - Migrate to ASP.NET 2.0 and use asynchronous pages. (This is something I'm looking into now for our projects.)

 

I'd be happy to help in any way I can. I'll sign an NDA if you want.

 

With the exception of 6 and 7, we've done the rest. AJAX-style pages alienate far too many geocachers to be of any use at this point. And 6 wouldn't do a whole lot when it comes to saving processing power, which is the bottleneck at the moment. We need a very experienced DBA to walk through the code with us and find better indexing and table structures to make the site work faster.

 

caderoux - I believe Elias has already contacted Kimberly Tripp to help us, though she gets tied up a bit on the conference circuit. Elias and I went to her talks at the last Connections conference. We also have some e-mails in to some consulting groups. We'll figure it out.
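For readers curious what "better indexing" might look like in practice, here is the flavor of covering index (SQL Server 2005's INCLUDE syntax) a DBA might propose after reading the execution plans, so a recent-logs query is served from one narrow index instead of bookmark lookups into a wide table. The table and column names are hypothetical, and in reality you'd run the DDL from Management Studio after testing, not from application code.

```csharp
using System.Data.SqlClient;

// Purely illustrative DDL, embedded here only to keep the sketch self-contained.
class IndexSketch
{
    static void CreateCoveringIndex(string connStr)
    {
        const string ddl =
            "CREATE NONCLUSTERED INDEX IX_Logs_Cache_Date " +
            "ON dbo.Logs (CacheId, LogDate DESC) " +
            "INCLUDE (UserId, LogType)"; // covers the SELECT without row lookups
        using (SqlConnection conn = new SqlConnection(connStr))
        using (SqlCommand cmd = new SqlCommand(ddl, conn))
        {
            conn.Open();
            cmd.ExecuteNonQuery();
        }
    }
}
```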

This topic is now closed to further replies.