Everything posted by bjornkn

  1. Marky and Keystone Approver: Yes, I knew that option was available. But the fact that it serves all of your offline needs doesn't mean that it serves everyone else's. It doesn't really serve mine, because I'd also like to have the logs. When I started this thread I was hoping for (and expecting) a discussion where we could find ways to decrease the load on the servers, so that they could go on providing this nice service in the future. How naive! It ended up with people telling us what we need and want, instead of discussing ways to improve the system so that we can get what we really want without making the servers collapse. Oh well then, let's just let the snowball keep rolling downhill until it hits the wall or collapses under its own weight.
  2. Marky, it's nice to see that we share the concerns for the well-being of the server. I believe I have said why that scheme doesn't work so well for me: because I find the versatility and power of GSAK combined with Ozi Explorer so much better. My suggestions were also made to help decrease the load on the servers, and I'm sure that being able to get just "what's new" instead of the entire dataset every time would help a lot. What's wrong with that? What is so wrong with suggesting changes that would make the database more useable for more people, while at the same time decreasing the load on both the database and mail servers?

     After following this thread I think I understand GrizzlyJohn's frustrations better. There seems to be no interest in discussing new and possibly better ways to run this database. It all ends up with the old-timers telling us how to do it, and to stop complaining. But what if we want to use it in a different way from you? Why don't you want to listen to us? What if we can see it with fresher eyes than you, newbies as we are? I really have no problems with the way it works now, because I can get what I want, although I need to use some workarounds. The problem is that whenever I receive a PQ, about 99% of it is data which I already have in my GSAK database. And that is more of a problem for the database and mail servers than it is for me. You may keep buying new servers all the time to keep up with the load, but if I were running that business I would certainly be willing to look at new and less expensive solutions...

     And then to Mopar: how many caches do you think we need to find before we should be allowed to speak? Should that be a fixed number, or a percentage of the total number of caches within, say, 100 miles from your home? As you apparently did a little research on my statistics, just for fun I did the same on you. I picked one of your (if I may say so, for such a hardcore geocaching pro) surprisingly few owned caches, "Highland Woods, Too", which I assume is pretty close to where you live. Showing the nearest caches (I didn't bother making a PQ and entering it into GSAK) revealed that within 100 miles you have 3181 caches to find. A similar search in GSAK for the same distance from my home shows that there are 97 caches available (in Norway). Apparently you've been geocaching for 3 years now. For someone who has been geocaching for 2 months (I was without a GPS for a month) in such a sparsely populated cache area, who likes to go hunting for caches while walking his dog (easy to ask) and his son (12 years old, and not so easy to ask...), and who also likes/needs to do other things than geocaching, I'd say my statistics aren't that bad? Maybe I'll even get as good as you one day? Maybe then you'll be willing to listen without trying to intimidate me?
  3. Yes, it seems to drift off in the wrong direction. Although it's good to see that I'm not the only one with "bad habits" here, I think the discussion on the future load on the database, and how to prevent it from collapsing, is very important. I'm sure none of us with "bad habits" really wants to get the entire set of caches we're interested in several times each week. What I want is the ability to be fairly up to date offline, and to achieve that I would be quite happy to receive small emails a few times each week with "what's new".

     If that 500 limit is hard-coded, why not change it? Surely that code has to be changed and recompiled every now and then anyway? Code isn't something that is carved in stone and has to stay like that forever. Some years ago I developed a fairly big database which handled millions of grabbed video images, lots of subscribers, payment info, as well as about 30-50k new "logs" every day. Although it worked pretty well, I still had to recompile quite often to fix or change something. It wasn't that big a deal.

     The geocaching database seems to be pretty well organised, although IMO it shows some signs of growing pains. One example is how the waypoint names have grown from just a few characters to the current 6, and they will soon have to grow even larger. And why is there always a "GC" at the front? That shouldn't be necessary? But the main thing is how PQs are handled. Just a simple thing like adding a "Last updated during" search field, in addition to the "Placed during" one, would save the servers a lot of work (the first sketch after these posts shows the idea). I have no idea how many users there are on geocaching.com, or how many logs are inserted each day (there aren't many statistics to be found here...), but it would be very interesting to see the growth rates. My guess is that they are pretty steep, and growing exponentially.

     Finally, I couldn't resist jumping in on that respect "thread". If there was one person who started showing disrespect in this thread, it has to be Mopar, with his reply to my first ever post on this forum. I'm still glad I never sent my first reply...
  4. It looks like you're more interested in discussing my "bad habits" than my suggestions, which would definitely help me improve on those habits. Earlier this summer we had 500 caches in Norway, and now it's 820. One year ago it was about 180. With this growth, which I suppose is not unique to my country, something has to be done to the PQ system and the database IMO, or else it will kneel pretty soon. As Bull Moose writes, why would they allow us to download 17500 caches each week if they didn't want us to keep offline databases? And why should they stop us from using offline databases if that is what we want?

     I don't know how your setups are, but I'm using an iPaq with a BTGPS, along with TomTom Navigator (for coarse car/bike navigation) and OziCE. I import PQ ZIPs into GSAK, and from there I can export POI files for TomTom, waypoint files for OziCE, and HTML pages to be viewed offline on the iPaq (the second sketch after these posts shows roughly what that import step does). It works very well. When I go geocaching it's almost always on short notice, when time allows. If I had to go to geocaching.com and download each cache for a certain area one by one, there would be little time left for geocaching with updated info after I'd run it through GSAK. New caches pop up every day, but unfortunately most of them are at least 70km away. Currently the only way to get them in GPX format right away is to download them one by one: you can't get a GPX/PQ on the fly, and you can only download a selection of caches in LOC format. Keeping an updated GSAK database is currently simply the best solution for me.

     But, as I've tried to emphasize, I don't subscribe to all those PQs because I want to, but because that's currently the only way I can get the data the way I want it. I want the GSAK functionality, but I also want to receive smaller emails - and I want the geocaching database to survive. Why do you all think it is so wrong to change the way things work now? Why would it be so wrong if we could get a daily/weekly "report" with all the new caches, newest logs etc. from our selected area(s), instead of the entire set of data each time?
  5. Well, it doesn't help much burying your head in the sand and denying reality. The reality is that many geocachers (at least here in Norway) who are using GSAK or something similar regularly download (actually, receive PQs for) the entire country several times a week. As a geo-newbie I also find it very interesting to look at cache descriptions from all over the country. GSAK is much more flexible than the online database, and also much faster. If many people used GSAK offline instead of browsing online, it would decrease the load on the database servers a lot, and everything would work much faster.

     If you read my post you'd see that it was essentially a set of suggestions to the database maintainers that could help decrease the burden on the servers. You never mentioned that issue with a single word. When you run queries, it takes much more CPU time to run two similar queries that each return 500 rows than one single query that returns 1000 rows. Having a "maintenance query" would also remove a huge amount of strain from the servers - I would estimate that it could save on the order of 99% if we could get only the newly updated logs instead of the entire database each time (the third sketch after these posts illustrates how little of each PQ is actually new). If you think that is a bad idea, I don't think we need to discuss this any more.

     It doesn't matter what the intention of the database was when it was created. It's just like when Bill Gates & co intended DOS to be purely an office tool, and thought that there would never be a need for more than 640KB of RAM. What counts is the present situation - and thinking ahead.
  6. I didn't find any forum group for discussing the services of the geocaching.com database, so I'll try posting this here. Pocket Queries are a very nice "tool" which allows you to keep an offline database, like GSAK, updated all the time. Apparently it must take a lot of CPU power to handle all those queries. But if it were changed slightly, I'm sure you would find that it would need a lot less power, save a lot of bandwidth, and also be much faster.

     Take my case as an example. I live in Norway, where there are currently about 800 caches. I use GSAK for maintaining an offline database. Right now I need to run at least 3 queries to get all caches updated, which means a lot of CPU power and wasted bandwidth, because most of the data hasn't changed since the last time it was run. To get around that 500 limit I have to:

     - Run a query for all caches placed before Jan 1 2004
     - Run a query for all caches placed between Jan 1 2004 and Aug 10 2004
     - Run a query for all caches placed during the last month

     And I have to do this several times each week to stay up-to-date! If there were no limit on the number of caches retrieved, or if the limit were much higher, I could run one single query for all of Norway (the last sketch below shows how the date-splitting workaround could at least be automated in the meantime). There could still be a limit on how many caches get packed into each zip file, so that the email attachments don't get too big.

     A lot of bandwidth and CPU time could also be saved if there was a "maintenance query" that you could subscribe to, which would send you only the new logs and the new and edited cache pages since your last query. Considering the explosive growth of geocaching, at least here in Norway, I have a feeling that the geocaching.com database will soon face severe problems keeping up with the load. It's already quite slow at times...
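
First sketch: the "Last updated during" filter proposed in post 3, as a minimal query-side change. This assumes a SQL store behind the PQ generator; the table and column names are invented for illustration, since the real geocaching.com schema is not public.

    import sqlite3

    # Hypothetical schema; geocaching.com's real schema is not public.
    conn = sqlite3.connect(":memory:")
    conn.execute("""CREATE TABLE caches (
        waypoint TEXT PRIMARY KEY,   -- e.g. 'GC1234'
        placed   TEXT,               -- date the cache was placed
        updated  TEXT                -- bumped on every new log or page edit
    )""")
    conn.executemany("INSERT INTO caches VALUES (?, ?, ?)", [
        ("GC0001", "2003-05-01", "2004-08-01"),
        ("GC0002", "2004-01-15", "2004-09-10"),
        ("GC0003", "2004-08-20", "2004-09-12"),
    ])

    # Today's PQ: every cache placed up to now -> mostly unchanged data.
    full = conn.execute(
        "SELECT * FROM caches WHERE placed <= '2004-09-12'").fetchall()

    # Proposed filter: only caches touched since the subscriber's last run.
    delta = conn.execute(
        "SELECT * FROM caches WHERE updated >= '2004-09-09'").fetchall()

    print(len(full), "rows in the full PQ,", len(delta), "in the delta")
    # -> 3 rows in the full PQ, 2 in the delta

With an index on the updated column, the delta query also touches far fewer rows than re-selecting the whole country every time.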
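Second sketch: roughly what the import step described in post 4 does - open a Pocket Query ZIP, parse the GPX inside, and flatten the waypoints to something exportable. It assumes a standard GPX 1.0 file and ignores the groundspeak extension data a real PQ also carries; the file names are examples only.

    import csv
    import xml.etree.ElementTree as ET
    import zipfile

    # GPX 1.0 namespace used by Pocket Query files.
    NS = "{http://www.topografix.com/GPX/1/0}"

    def pq_to_csv(zip_path, csv_path):
        """Read the GPX inside a PQ zip and write a flat waypoint CSV."""
        with zipfile.ZipFile(zip_path) as zf:
            gpx_name = next(n for n in zf.namelist() if n.endswith(".gpx"))
            root = ET.fromstring(zf.read(gpx_name))
        with open(csv_path, "w", newline="") as out:
            writer = csv.writer(out)
            writer.writerow(["code", "lat", "lon", "description"])
            for wpt in root.iter(NS + "wpt"):
                writer.writerow([
                    wpt.findtext(NS + "name", ""),  # waypoint code, e.g. GC1234
                    wpt.get("lat"),
                    wpt.get("lon"),
                    wpt.findtext(NS + "desc", ""),
                ])

    # pq_to_csv("123456.zip", "norway.csv")  # file names are examples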
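Third sketch: the redundancy claimed in post 5 - that roughly 99% of each PQ is already in the offline database - shown as a toy upsert keyed by waypoint code. The records and the last_log field are stand-ins for whatever GSAK actually stores.

    def merge(store, pq_records):
        """Upsert PQ records into the offline store; return how many changed."""
        changed = 0
        for rec in pq_records:
            old = store.get(rec["code"])
            if old is None or old != rec:
                store[rec["code"]] = rec
                changed += 1
        return changed

    # Offline database keyed by waypoint code, as GSAK effectively keeps it.
    store = {
        "GC0001": {"code": "GC0001", "last_log": "2004-09-01"},
        "GC0002": {"code": "GC0002", "last_log": "2004-09-05"},
    }
    # A fresh PQ re-sends everything; only a fraction is actually new.
    fresh_pq = [
        {"code": "GC0001", "last_log": "2004-09-01"},  # unchanged
        {"code": "GC0002", "last_log": "2004-09-10"},  # got a new log
        {"code": "GC0003", "last_log": "2004-09-12"},  # newly placed cache
    ]
    print(merge(store, fresh_pq), "of", len(fresh_pq), "records were new")
    # -> 2 of 3 records were new

Everything the merge throws away was still generated, zipped, and mailed by the servers - which is the waste a server-side delta would avoid.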
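Last sketch: the three-query workaround from post 6, automated by splitting a country's caches into "Placed during" windows that each stay under the 500 cap. The dates and counts here are invented, not real PQ parameters.

    from datetime import date

    LIMIT = 500  # current per-PQ cache cap

    def split_by_placed_date(placed_dates, limit=LIMIT):
        """Group placement dates into "Placed during" windows of at most
        `limit` caches each. Windows sharing a boundary date could
        double-count caches placed on that date; a real split would nudge
        the boundaries apart."""
        placed_dates = sorted(placed_dates)
        ranges, start, count = [], placed_dates[0], 0
        for i, d in enumerate(placed_dates):
            count += 1
            if count == limit and i + 1 < len(placed_dates):
                ranges.append((start, d))
                start, count = placed_dates[i + 1], 0
        ranges.append((start, placed_dates[-1]))
        return ranges

    # 820 invented placement dates spread over 2003-2004 -> two windows.
    dates = [date(2003, 1 + i % 12, 1 + i % 28) for i in range(400)]
    dates += [date(2004, 1 + i % 8, 1 + i % 28) for i in range(420)]
    print(split_by_placed_date(dates))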