+GR8Caches Posted February 21, 2009 Sorry if this has been posted in the past; my searches turned up nothing that answered this question. I want to show statistics on all of my own caches using GSAK, but I have not downloaded my cache information on a regular basis, so not all of the logs have been captured. Is there a way to get this back data? I have created a pocket query, but it only goes back a few logs. I have also tried downloading a GPX (GPS eXchange) file from each of the caches, but I still have missing logs. I have even tried expanding the listing to show all logs and then downloading the GPX file again, with no better results.
+Lil Devil Posted February 21, 2009 Bottom of the Pocket Query page. Look for "My Finds" and "add to queue."
AZcachemeister Posted February 21, 2009 The AddLogs macro is what you are looking for!
+The Leprechauns Posted February 21, 2009 A regular pocket query only returns the last five logs, plus any logs by the query's owner. A GPX file downloaded from the cache page only returns the last 20 logs. A "My Finds" pocket query only returns logs by the query's owner. That leaves the screenscraping macro as the only remaining alternative of which I'm aware.
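For anyone wanting to verify those limits against their own downloads, here is a minimal Python sketch that counts the logs actually captured per cache in a Pocket Query GPX file. The namespaces are the standard Groundspeak GPX 1.0 ones, but the filename is a hypothetical placeholder; adjust both to match your file.

```python
# Count how many logs a Pocket Query GPX file actually captured for
# each cache. The namespaces below are the standard Groundspeak
# GPX 1.0 ones; the filename is a hypothetical placeholder.
import xml.etree.ElementTree as ET

NS = {
    "gpx": "http://www.topografix.com/GPX/1/0",
    "gs": "http://www.groundspeak.com/cache/1/0",
}

tree = ET.parse("my_finds.gpx")  # placeholder: your PQ download
for wpt in tree.getroot().findall("gpx:wpt", NS):
    code = wpt.findtext("gpx:name", default="?", namespaces=NS)
    cache = wpt.find("gs:cache", NS)
    if cache is None:
        continue  # waypoint without Groundspeak cache data
    logs = cache.findall("gs:logs/gs:log", NS)
    print(f"{code}: {len(logs)} logs captured")
```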
+pppingme Posted February 21, 2009 Quoting The Leprechauns: "That leaves the screenscraping macro as the only remaining alternative of which I'm aware." Screenscraping is usually thought of as automated and non-interactive. I don't think most people would consider saving the HTML off one or two pages (an interactive action) to be screen scraping; that is what the user wants and is what the suggested macro does.
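As a rough illustration of that interactive workflow, here is a sketch that parses a cache page saved manually from the browser rather than fetching anything itself. It assumes the third-party BeautifulSoup library, and the filename and the "LogsTable" selector are hypothetical; you would need to inspect the real saved page and adjust them.

```python
# Extract log text from a cache page saved manually in the browser
# (File > Save Page As). Nothing is fetched over the network, which
# is the interactive approach described above. Requires the
# third-party BeautifulSoup library (pip install beautifulsoup4).
from bs4 import BeautifulSoup

with open("GC12345.html", encoding="utf-8") as f:  # placeholder file
    soup = BeautifulSoup(f, "html.parser")

# Hypothetical markup: assumes each log sits in a row of a table
# with class "LogsTable"; inspect the real saved page and adjust.
for row in soup.select("table.LogsTable tr"):
    text = row.get_text(" ", strip=True)
    if text:
        print(text)
```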
+Glenn Posted February 22, 2009 Quoting pppingme: "Screenscraping is usually thought of as automated and non-interactive. I don't think most people would consider saving the HTML off one or two pages (an interactive action) to be screen scraping; that is what the user wants and is what the suggested macro does." I prefer the term webscraping, because screenscraping refers to extracting information from anything that can be displayed on a computer screen, not just a web browser. But screenscraping has become the popular usage. Anyhoo... I think most people would consider extracting info via a third-party program, even manually and even from a few HTML pages, to be screenscraping, because that is the definition of screenscraping. You may be thinking of spidering or crawling, which is using a program to automatically follow the HTML links on webpages. A poorly written spider that follows more links than it really needs, and follows the same link more than once, combined with a webscraper that requests the entire webpage each time, can mimic a DoS (denial of service) attack. If a poorly written spider-scraping tool were made available for download, it could easily mimic a DDoS (distributed denial of service) attack. If you think geocaching.com is slow on the weekends now, it would be impossible to access if spiders or crawlers were allowed. All that being said, I don't see a problem with screenscraping. However, those at Groundspeak have often voiced concerns about using old, out-of-date information for finding caches and about sharing premium member information with those who aren't premium members.
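To make the spider point concrete, here is a sketch of the two safeguards that separate a polite crawler from the failure mode described above: a visited set so no page is fetched twice, and a delay between requests. The start URL is a placeholder; this illustrates the concept and is not a tool for crawling geocaching.com.

```python
# Sketch of the two safeguards that keep a spider from mimicking a
# DoS attack: a visited set (never fetch the same URL twice) and a
# delay between requests. The start URL is a placeholder.
import re
import time
from urllib.parse import urljoin
from urllib.request import urlopen

start = "http://example.com/"  # placeholder site
visited = set()
queue = [start]

while queue:
    url = queue.pop()
    if url in visited:  # the "same link more than once" failure mode
        continue
    visited.add(url)
    html = urlopen(url).read().decode("utf-8", errors="replace")
    for href in re.findall(r'href="([^"]+)"', html):
        link = urljoin(url, href)
        if link.startswith(start) and link not in visited:
            queue.append(link)
    time.sleep(2)  # throttle: at most one request every two seconds
```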