
RSS / Atom Feed For Geocaches??


nafai


It would be very nice if geocaching.com created an RSS feed for caches. I don't know what sort of technology you are using for your backend database, but I've yet to see a system that didn't allow for a fairly simple addition of automated RSS / XML creation along with the web page.

 

Is this planned for the future perhaps?

It would be very nice if geocaching.com created an RSS feed for caches. I don't know what sort of technology you are using for your backend database, but I've yet to see a system that didn't allow for a fairly simple addition of automated RSS / XML creation along with the web page.

 

Is this planned for the future perhaps?

WTF??? Can you put that in "people who have no clue what you mean" terms? lol


Not really. First, a pocket query is only usable by premium members. As the descriptor page indicates, they are also fairly processor or bandwidth heavy, and they are run on a schedule.

 

RSS feeds use XML, which is a neutral format that can be used by developers for an infinite number of uses.

 

Essentially, I'm making a Dashboard Widget (http://www.apple.com/macosx/features/dashboard/) that will show Caches in your area by zipcode. If an RSS XML feed was provided, this would not only make this task much easier, but it would also be much friendlier for the server than the implementation I'm currently going to have to use: site scraping, which will essentially load the entire page (graphics, menus, extraneous text and all) of http://www.geocaching.com/seek/nearest.asp...p=XXXXX&x=0&y=0

 

That's a 100k hit to the server every time it's invoked, versus about 5k for the average RSS XML feed file.
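For what it's worth, the client side of the feed approach really is tiny. A hedged sketch in Python (the feed is hypothetical, since no such feed exists; the parsing is shown against a sample string):

```python
# Minimal sketch of consuming a caches-by-zipcode RSS feed, if one existed.
# The sample document below is invented; only the parsing is real.
import xml.etree.ElementTree as ET

RSS_SAMPLE = """<?xml version="1.0"?>
<rss version="2.0"><channel>
  <title>Caches near 98101</title>
  <item><title>Example Cache</title><link>http://www.geocaching.com/seek/</link></item>
</channel></rss>"""

root = ET.fromstring(RSS_SAMPLE)
titles = [item.findtext("title") for item in root.iter("item")]
print(titles)  # ['Example Cache']
```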

I'm currently going to have to use: site scraping, which will essentially load the entire page (graphics, menus, extraneous text and all) of http://www.geocaching.com/seek/nearest.asp...p=XXXXX&x=0&y=0

 

Be sure you read the TOU here:

 

TOU

 

Make special note of:

 

You agree that you will not use any robot, spider, scraper or other automated means to access the Site for any purpose without our express written permission.

> Not really. First, a pocket query is only usable by premium members.

 

If you're considering an RSS feed for non-premium members, I don't see it happening because it would undermine the site's income.

 

> As the descriptor page indicates, they are also fairly processor or

> bandwidth heavy, and they are run on a schedule.

 

But wouldn't your tool require the same kind of queries? Unless you're planning on retrieving all the data and then processing it on another server, I don't see how you can avoid hitting the geocaching.com database for queries.

 

> RSS feeds use XML, which is a neutral format that can be used by

> developers for an infinite number of uses.

 

That would be great but I doubt it will come about any time soon due to its threat to the site's income.

 

Good luck with your project, though.

 

GeoBC


I see two possibilities for an RSS feed, neither of which make any sense.

 

1. A feed that shows all new caches as they are approved. This would be of negligible benefit as most caches would not be anywhere near you.

 

2. A customizable feed that shows only the new caches near you. Well, we already have that: Pocket Queries. In case you've never looked at a GPX file, it's simply XML with a custom namespace. Open it up in an XML viewer and there is all your info. Better yet, open it in something like Watcher that understands GPX and shows the data in the proper context.

 

So what would an RSS feed give you that Pocket Queries don't? :rolleyes:
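For anyone who hasn't cracked a GPX file open: it really is plain XML, and a few lines of Python can pull the waypoints out. The sample below is a made-up fragment, but its shape (namespaced wpt elements under a gpx root) follows the GPX format Pocket Queries are delivered in:

```python
# Sketch: reading waypoints out of a GPX document with the standard library.
# The sample data is invented for illustration.
import xml.etree.ElementTree as ET

GPX_SAMPLE = """<?xml version="1.0"?>
<gpx xmlns="http://www.topografix.com/GPX/1/0" version="1.0" creator="example">
  <wpt lat="47.6062" lon="-122.3321">
    <name>GCXXXX</name>
    <desc>Example Cache</desc>
  </wpt>
</gpx>"""

NS = {"gpx": "http://www.topografix.com/GPX/1/0"}

root = ET.fromstring(GPX_SAMPLE)
waypoints = []
for wpt in root.findall("gpx:wpt", NS):
    waypoints.append({
        "lat": float(wpt.get("lat")),
        "lon": float(wpt.get("lon")),
        "name": wpt.findtext("gpx:name", namespaces=NS),
    })

print(waypoints)
```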

Edited by Lil Devil

If Pocket Queries are the sole feature you're using to sell Premium Memberships, then allow me politely to draw the analogy between this site, the RIAA, and the early 20th century railroad industry.

 

Technology changes, and people's expectations of how things work change. Businesses that change with it and figure out how to remain profitable and renew themselves succeed. Others blame technology or try to block said progress, instead of blaming their poor business model.

 

RSS syndication is here to stay. Scraping, whether it's with JavaScript, or cURL commands through PHP, or even XML hijacking with ASP, is also here to stay. People are interested in having the bits and pieces of many things that they like all in one place. These technologies give them that. And as more web browsers begin integrating it, people will become accustomed to having that type of content delivery and notification.

 

As for the site scraping, do you find and block non-members who run a search with their zipcode twice an hour from 9 to 5? If I, as Joe Schmoe, cause 16 page hits, do you prevent me from accessing your website in the future? I'm guessing no, and if you think that people aren't already using site scraping to mine the data they are personally interested in, you're fooling yourself.

 

As a content provider myself, I have no desire to break your TOU. I will cease development via the scraper, and will pay for, and receive the Pocket Queries, which I did not realize were XML.

 

I sincerely hope, however, that if nothing else, I have caused some questions to be raised. You can blame, discourage, and get upset at these technologies that allow people to access your site's data in manners that you don't intend, and get swallowed up as people do it anyway, or you can look at the communicated and tangible value of your Premium Memberships, which at present, are severely lacking. But I'm sure your server statistics show you how many people hit this site regularly and are never converted over to paying members.

Technology changes, and people's expectations of how things work change. Businesses that change with it and figure out how to remain profitable and renew themselves succeed. Others blame technology or try to block said progress, instead of blaming their poor business model.

Wow!! Not enough sugar on your corn flakes??

 

I pay for the service to keep it running, not because GC.com offers anything spectacular for paying members. (they do - pocket queries are cool).

 

You have fallen in love with your expectations of technology but that doesn't make them true.

 

I too like the bits of information found on GC.com web pages - that is why I look at the web pages without feeling the great need to write software to rip the page into tiny pieces of information. Your complaint seems centered on the data not being provided in a format you like - ok great, I can respect that but remember it isn't your data.

 

Seems to me that a poor business model would be to "give" away all of your business data.

I pay for the service to keep it running, not because GC.com offers anything spectacular for paying members. (they do - pocket queries are cool).

 

...

 

Your complaint seems centered on the data not being provided in a format you like - ok great, I can respect that but remember it isn't your data.

 

Seems to me that a poor business model would be to "give" away all of your business data.

 

You do? Just to keep it running, you'd pay? Are you the type of person they are targeting with their subscription model? If so, and I doubt it, they're doing you, and themselves, a disservice by setting their membership price so low, or at least by not offering different tiers. Sites with hardcore fans that use a donation model quickly find that people like you will gladly part with more than $3 a month to help keep something they love alive. If they are targeting people such as yourself, then a subscription model is the wrong choice.

 

As for the format of the data: it *is* being provided in the manner I want. XML is all that's needed, and as I mentioned, was unaware that Pocket Queries were in that format.

 

And no, I don't believe that "giving away" all of your business data is the way to go. The subscription page mentions: "we need to move to a subscription basis for newer features for the web site." Ok...well, how about listing some of these said features, or things that are on the table for rolling out in the future? Is it limited to scripted queries emailed to me, and access to members-only caches? If it is, that's fine too, but you have to communicate that value to potential customers.

 

Heck, I'd even say to limit the free participation back from what it is at present. To me, tracking my individual caching progress is the most valuable service this site offers. It has a tangible benefit that I don't have to connect any dots as to why it would need to be charged for. As Joe User, it certainly feels more "premium" to me than the server typing in my zipcode for me 5 times a day (oversimplification, yes, but you get my point).

 

All I'm saying is that the present premium membership isn't communicating its value very well; if PQs are the Bees Knees, then it needs some PR work, and definitely needs to be made more prominent on the site. A tagline listing the features on the subscription page doesn't cut it. Customers don't buy features, they buy benefits. "It's like having the geocaching.com website on your PDA?" Hello? AvantGo? Don't shoot the messenger.

 

But this is terribly off topic. I wanted an XML file of queried data, and I have it. Feeds still have an advantage that would make PQs more marketable; they could be accessed via any internet capable device, and not limited to scheduled emails.


Why is it that the people who use this site to find hundreds, even thousands, of caches generally have no problem with the way it works, and all the people who know a better way to do things only find 1-2 caches a year?

 

PS: Before you ask, sharing that PQ data also violates the TOU.


What are you talking about? I've had plenty of ideas and they don't go anywhere.

 

I think the RSS idea is a great one.

 

While I don't think it would go anywhere, here's an idea on one use. We'll take the recent cache approvals example. I use something similar on our site. However, I dedicate a single daily PQ to produce the caches placed in the last 7 days in South Carolina. I get the PQ and massage it through a batch file which converts it and then uploads it to the site. The script then takes it from there.

 

However, it is generated only once a day.

 

This site could create a similar scheme. Static files based on predetermined regions, be it a state, part of a state, x distance from a city, whatever. The file is generated each time a cache is approved, not on the fly. Once a day, it is completely re-generated to catch name changes or what have you.

 

This RSS could then be plugged into something like your Yahoo! homepage, a regional geocaching site, or read by a newsreader.

 

Because it is updated on each new cache instead of daily, the data is more timely. An FTF hound could be watching a static file that only hits the web servers instead of watching the nearest page with its database hits.
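As a sketch of that scheme (names and data invented; this is just the shape of the idea, not anything gc.com runs): regenerate a small static RSS file for a region whenever a cache is approved, and let readers hit the static file.

```python
# Hypothetical "static regional feed" generator: rebuild a tiny RSS 2.0
# document whenever a cache is approved, so feed readers never touch the DB.
import xml.etree.ElementTree as ET

def build_region_feed(region, caches):
    """Build an RSS 2.0 document listing recently approved caches."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = f"New caches: {region}"
    ET.SubElement(channel, "link").text = "http://www.geocaching.com/"
    ET.SubElement(channel, "description").text = "Recently approved caches"
    for cache in caches:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = cache["name"]
        ET.SubElement(item, "link").text = cache["url"]
    return ET.tostring(rss, encoding="unicode")

feed = build_region_feed("South Carolina", [
    {"name": "Example Cache", "url": "http://www.geocaching.com/seek/"},
])
print(feed)
```

In production the returned string would be written to disk once per approval, and once more in a nightly full rebuild to catch name changes.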

 

That's just one.

Why is it that the people who use this site to find hundreds, even thousands, of caches generally have no problem with the way it works, and all the people who know a better way to do things only find 1-2 caches a year?

Yes, it couldn't possibly be that there are hundreds if not thousands of people who find caches regularly but just don't bother to create and update a registered account. The site doesn't require it, and many people like to preserve their anonymity online unless it's absolutely necessary to do otherwise. Don't assume that because someone's profile doesn't show many caches, they aren't an avid geocacher.

 

But for the record, it's sad to see that this community is no different from other niche hobbyist communities. New people and new ideas are scorned by old-timers, as if a new person or a new way of enjoying this hobby or this site somehow harms them personally. <rolls eyes>

 

I guess people pick and choose what they want to pay attention to.

 

>> PS: Before you ask, sharing that PQ data also violates the TOU.

 

"I have no desire to break your TOU..." If I develop a Dashboard Widget using PQ data, you will have to be a premium member capable of receiving your own PQs. Seriously, I'm being reasonable here. The kneejerk reactions are disconcerting.

 

Btw, a major flaw in the premium member section: people here had to alert me to the fact that it even existed. I've been using this site for some time and never come across it. Not only is the choice not heavily attractive for users looking at the option, but it's a Geo Hunt just to discover that this site *has* premium content, and to then find out what that premium content is!

Edited by nafai

From the front page of Geocaching.com:

 

Create or Upgrade your Membership

 

A basic membership is free! If you create an account on geocaching.com we'll let you know when new caches are hidden in your area. You'll also have the ability to log your finds online to share your experience with the rest of the community.

Create an account now.

 

You can also receive additional features by upgrading your membership. Help support Geocaching.com and get access to additional features like advanced mapping and search options through Pocket Queries.

Become a Premium Member today.

 

Premium member ads also run in the small ad bar on the left side of each and every cache page. It's pretty darn hard to feature something any more prominently than on the main homepage and on each and every cache page. You want popup ads?

Edited by Keystone Approver
Premium member ads also run in the small ad bar on the left side of each and every cache page.  It's pretty darn hard to feature something any more prominently than on the main homepage and on each and every cache page.  You want popup ads?

There's so much text on the main page, I'd hardly call that prominent. It certainly does nothing visually to set it apart from the rest of the articles. It's styled the same, using identical headlines, fonts, coloring, etc. as the featured articles, and what's worse: it's below the fold.

 

Heavens no you don't need popups, but browse around to even a handful of other subscription based sites if you'd like to see better, nonintrusive ways to do it. The most logical would be to put it in the header login label (Header1_lblLogin). On one's first visit to the site, that spot is the most consistent in location and use for logging in or creating a new account with the majority of the web.

 

Here is where the problems start. On the login / account creation page, there is ZERO mention of subscription content and two different types of membership. This should be the very first thing on this page, I don't know (sarcasm), maybe under "Why create a Geocaching account?". This is where any user can be expected to be presented with such a thing, and will not be perceived as a nuisance of any sort.

 

Then, once logged in, you have to navigate to your account and see on the righthand side a brief mention of your membership status. Even something as simple as a bold color instead of bold black would call that to greater attention. Regardless, the Header Login label should have a bright red "Upgrade to Premium" link next to your name if you're a regular member.

 

As for the member ads on the small ad bar? I've never seen them. I see ads for license plates, keychains, and tee-shirts, but nothing about premium membership. I have just tried 5 caches, both logged in and logged out, refreshing quite a few times to see if I've missed a trigger. To eliminate any browser incompatibilities, I did a text search on the html source for both "premium" and "member" and found nothing. Methinks they aren't working like you want them to, cap'n.

 

The subscriber-only caches are also poorly marked. There's a little icon next to them, but that icon doesn't communicate that a cache is subscriber-only until you figure it out on your own, and your Cache Types page doesn't even SHOW or define that such an indicator exists!

 

Seriously, if you go through your site from the perspective of someone without any knowledge of or expectations about how you've designed it, you'll quickly see that you are rather hiding the existence of your subscription content. It doesn't have to be a popup window to be obvious, and it doesn't have to be buried to remain unintrusive.


I actually think (being US specific here) that an individual RSS feed by state for new caches is probably doable, relatively uncomplicated, and likely not much of an impact on the system.

 

I like the example above for South Carolina and I'll look to do the same thing for a site we're putting together for a Northern NJ group. It would just be a daily update based on a PQ, but a more timely feed direct from gc.com would of course be a more elegant solution.

 

I don't see this as any different than the previous discussion on the same sort of topic, feeding TiVos, which seemed to have been initiated by Jeremy.

Why is it that the people who use this site to find hundreds, even thousands, of caches generally have no problem with the way it works, and all the people who know a better way to do things only find 1-2 caches a year?

What difference does that make? If he comes up with a better way of doing things, it will still be a better way, no matter how many caches someone else finds.

 

I read elsewhere on this board that someone had logged 1000+ finds w/o using a GPS. Should we be rolling our eyes thinking about all technogeeks who think they need a GPS to find these caches?

 

After reading nafai's posts, I think it would be cool to have a more automated way of aggregating all the data I commonly upload to my PDA.

 

GeoBC

There is a way: become a Premium member and use Pocket Queries via GSAK. It makes the process VERY easy and sortable and selectable and just plain cool.

If you are answering my post:

 

Pocket Queries aren't quite the same thing since they would require me to d/l the file and then process it with GSAK. I'm talking about a web app that collects a variety of RSS feeds (I d/l various web content to my PDA on a regular basis) and then massages it all into the various formats that I can then HotSync onto my PDA.

 

My point is that ongoing improvements in services are a good thing. Just because PQ's work doesn't mean that there isn't room for improvement.

 

GeoBC


While I know scraping is in violation of the TOU and I would never want to violate those, as a .NET developer and Google Earth fan, it would be really cool to integrate the two for Geocaching.com Premium Members.

 

It might look like this (and I'd be willing to pay to use such a feature). I fire up Google Earth and zoom in to a certain predetermined zoom-level. Once there, Google Earth passes the centered lat/long to (once again, a Premium Members only) Geocaching.com script which generates XML of caches within the viewable area.
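A guess at what the server side of that idea might look like (the function, cache list, and coordinates are all hypothetical): take the caches near the viewer's centre point and emit KML placemarks for Google Earth to draw.

```python
# Hypothetical sketch of a script answering a Google Earth network link:
# given nearby caches, emit a minimal KML document of placemarks.
import xml.etree.ElementTree as ET

def caches_to_kml(caches):
    """Emit a minimal KML document with one Placemark per cache."""
    kml = ET.Element("kml", xmlns="http://earth.google.com/kml/2.0")
    doc = ET.SubElement(kml, "Document")
    for c in caches:
        pm = ET.SubElement(doc, "Placemark")
        ET.SubElement(pm, "name").text = c["name"]
        point = ET.SubElement(pm, "Point")
        # KML orders coordinates as longitude,latitude
        ET.SubElement(point, "coordinates").text = f"{c['lon']},{c['lat']}"
    return ET.tostring(kml, encoding="unicode")

kml_doc = caches_to_kml([{"name": "Example Cache", "lat": 47.6062, "lon": -122.3321}])
print(kml_doc)
```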

 

For my purposes & style, this would be amazing and I would love to be part of the development volunteers for such an undertaking.

 

That's my two cents.

While I know scraping is in violation of the TOU and I would never want to violate those, as a .NET developer and Google Earth fan, it would be really cool to integrate the two for Geocaching.com Premium Members.

 

It might look like this (and I'd be willing to pay to use such a feature). I fire up Google Earth and zoom in to a certain predetermined zoom-level. Once there, Google Earth passes the centered lat/long to (once again, a Premium Members only) Geocaching.com script which generates XML of caches within the viewable area.

 

For my purposes & style, this would be amazing and I would love to be part of the development volunteers for such an undertaking.

 

That's my two cents.

I don't know about the kind of functionality you're asking for, but I do know that Google Earth will read GPX files (i.e., pocket queries) and display them.

I don't know about the kind of functionality you're asking for, but I do know that Google Earth will read GPX files (i.e., pocket queries) and display them.

Oh I know! And that's what excites me about this. Basically, all I'm talking about is automating that GPX import by getting Google Earth to add another checkbox, EXACTLY like they do for the Degree Confluence Project (www.confluence.org). If they can do it for that, why not Geocaches too?

 

I get your point though. I suppose in the meantime, I can use Pocket Queries to give me a new GPX file of caches in my area. *shrug*

I don't know about the kind of functionality you're asking for, but I do know that Google Earth will read GPX files (i.e., pocket queries) and display them.

 

What frustrates me about this was that it took some poking around on my own to find the Google Earth/Geocaching.com network link that did exactly what I originally requested. Since then, geocaching.com has made this functionality a little more prevalent (thank you!) but it's disappointing that I was unable to find the assistance I originally sought within the forums.

 

Thanks for your replies, though. They put me on the right track.


I'd also love to see an RSS feed of new caches. Plenty of sites offer RSS feeds of queries you input.

 

Yes, it is much like a pocket query, except that it is a standard format. (Also, RSS feeds generally update instantly; pocket queries only once a day. This feed would have far less data than a pocket query, so this would be fine.)

 

- You CAN make RSS feeds passworded (i.e., premium members only)

- You CAN put ads in RSS feeds. This does not hurt gc's 'model'. It also does not stress the database any more than the 'instant notify' feature does. Basically this is the same as the instant notify feature, except via RSS rather than email. I'd be happy if it were premium only.
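On the passworded-feed point: nothing exotic is needed, just ordinary HTTP Basic authentication on the feed URL. A sketch of the client side (the URL and credentials are invented):

```python
# Sketch: building an authenticated request for a hypothetical
# premium-members-only feed using HTTP Basic auth.
import base64
import urllib.request

def authed_feed_request(url, username, password):
    """Build a Request carrying HTTP Basic credentials for a protected feed."""
    req = urllib.request.Request(url)
    token = base64.b64encode(f"{username}:{password}".encode()).decode()
    req.add_header("Authorization", f"Basic {token}")
    return req

req = authed_feed_request("http://www.geocaching.com/feeds/new.rss", "alice", "secret")
print(req.get_header("Authorization"))  # Basic YWxpY2U6c2VjcmV0
```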

 

Also, it's bizarre how people on this site who don't really understand what something is are so quick to bash it just because it's a 'new' idea.

Edited by benh57
I don't know about the kind of functionality you're asking for, but I do know that Google Earth will read GPX files (i.e., pocket queries) and display them.

 

What frustrates me about this was that it took some poking around on my own to find the Google Earth/Geocaching.com network link that did exactly as I originally requested. Since, geocaching.com has made this functionality a little more prevalent (thank you!) but it's disappointing that I was unable to find the assistance I originally sought within the forums.

 

Thanks for your replies anyway, however. They put me on the right track.

 

In my experience the Google Earth<>GC.com KML link is extremely unreliable and slow. What's more, for a particular cache I was looking at last night, Google Earth put the 'cache' icon about a block away from where Google Maps did!

 

-B

What's more, for a particular cache I was looking at last night, Google Earth put the 'cache' icon about a block away from where Google Maps did!

It's supposed to be that way

Please note that the coordinates used in Google Earth are only an approximation and can be up to 100 ft from the actual location. Do not use the coordinates in Google Earth for cache hunting. It is merely a viewing tool for getting a general idea of the cache location.
What's more, for a particular cache I was looking at last night, Google Earth put the 'cache' icon about a block away from where Google Maps did!

It's supposed to be that way

Please note that the coordinates used in Google Earth are only an approximation and can be up to 100 ft from the actual location. Do not use the coordinates in Google Earth for cache hunting. It is merely a viewing tool for getting a general idea of the cache location.

 

Interesting. That note doesn't say that it is 'supposed to be that way', only that it is known to be that way. Is it on purpose, or is it a system limitation?


<snip>

Yes, it is much like a pocket query, except that it is a standard format. (also: rss feeds generally update instantly.. pocket queries only once a day. This feed would have far less data than a pocket query, so this would be fine)

<snip>

Actually, an RSS feed would most likely use more bandwidth than a pocket query. A pocket query is what is known as a push. The information is sent to your computer, and Groundspeak has complete control over when and how often. An RSS feed is known as a pull because the information is sent when you request it. The reason an RSS feed seems to update almost instantly is because users typically have their computers set to ask for the information every few minutes. Those little bits of frequently requested information quickly add up.
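Worth noting that HTTP already softens the polling cost described above: a client that sends If-Modified-Since gets a near-empty 304 response when the feed hasn't changed. A toy simulation of that behaviour (not gc.com's actual server logic):

```python
# Toy model of HTTP conditional GET for a polled feed: an unchanged feed
# costs a headers-only 304 instead of the full document.
def serve_feed(feed_body, feed_mtime, if_modified_since=None):
    """Return (status, body) the way a feed server with conditional GET would."""
    if if_modified_since is not None and feed_mtime <= if_modified_since:
        return 304, b""          # "Not Modified": no body is re-sent
    return 200, feed_body

body = b"<rss version='2.0'>...</rss>"
status, payload = serve_feed(body, feed_mtime=100, if_modified_since=100)
print(status, len(payload))  # unchanged feed -> tiny response
```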

 

Also, it's bizzare how people on this site who don't really understand what something is are so fast to bash it just because it's a 'new' idea.

This isn't a 'new' idea. Do a forum search and you will see many threads on this topic.


Actually, an RSS feed would most likely use more bandwidth than a pocket query. A pocket query is what is known as a push. [...] An RSS feed is known as a pull

You may very well be right, but what you're leaving out of the equation is important: Pocket Queries are customized per user. Each one is only useful to a single user. Even if all users customized their PQ to be identical, Geocaching.com is still pushing out each one individually.

 

An RSS feed, however, would be a single stream for everybody, with possible individual filters on either client or server side.

 

Presuming the interest in each is equal (I know, I know), a very strong argument could be made that the latter would have less of a footprint from the perspective of burdening the system. However, your point that Geocaching.com can control when this service runs (ex: during the slow hours of the overnight, leaving the site unburdened during daylight hours) would no doubt be compelling enough to stick with the former (ie: existing) option.
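A sketch of the client-side filter in that single-stream scenario (the cache data is invented; the distance calculation is the standard haversine formula): every reader pulls the same regional feed and simply discards items outside their own radius.

```python
# Client-side filtering of a shared feed: keep only items within a radius
# of home. Cache names and coordinates below are made up.
import math

def distance_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two points (haversine formula)."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

feed_items = [
    {"name": "Near Cache", "lat": 47.61, "lon": -122.33},
    {"name": "Far Cache", "lat": 34.05, "lon": -118.24},
]
home = (47.60, -122.33)
nearby = [i for i in feed_items if distance_km(home[0], home[1], i["lat"], i["lon"]) < 50]
print([i["name"] for i in nearby])  # ['Near Cache']
```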

 

... but the equation isn't as simple as Push-is-less-than-Pull

Edited by FfejNS
Interesting. That note doesn't say that it it is 'supposed to be that way', only that it is known to be that way. Is it on purpose, or is it a system limitation?

I'm pretty sure it's on purpose. Do this: Zoom in real far until just a few caches are in view. Then zoom just a smidge and wait until the display updates again. See the caches jump? Zoom some more. See them jump again. The system appears to be arbitrarily adding a random offset to each cache every time.
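If that's right, the mechanism could be as simple as this (pure speculation about Groundspeak's implementation; the offset size is an assumption based on the "up to 100 ft" note):

```python
# Speculative sketch of coordinate fuzzing: displace each cache by a fresh
# random offset on every render, so the displayed position "jumps".
import random

def fuzz(lat, lon, max_offset_deg=0.0003):  # ~0.0003 deg latitude is ~100 ft
    """Return coordinates displaced by a random offset within max_offset_deg."""
    return (lat + random.uniform(-max_offset_deg, max_offset_deg),
            lon + random.uniform(-max_offset_deg, max_offset_deg))

lat, lon = 47.6062, -122.3321
print(fuzz(lat, lon))  # differs slightly on every call, like the jumping icons
```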

 

I think the point is that TPTB don't want you hunting caches based on that view alone. They want you to visit the cache page (or download a PQ) to get the exact coordinates.


Actually, an RSS feed would most likely use more bandwidth than a pocket query. A pocket query is what is known as a push. [...] An RSS feed is known as a pull

You may very well be right, but what you're leaving out of the equation is important: Pocket Queries are customized per user. Each one is only useful to a single user. Even if all users customized their PQ to be identical, Geocaching.com is still pushing out each one individually.

 

An RSS feed, however, would be a single stream for everybody, with possible individual filters on either client or server side.

 

Presuming the interest in each is equal (I know, I know), a very strong argument could be made that the latter would have less of a footprint from the perspective of burdening the system. However, your point that Geocaching.com can control when this service runs (ex: during the slow hours of the overnight, leaving the site unburdened during daylight hours) would no doubt be compelling enough to stick with the former (ie: existing) option.

The problem is that a single feed to everyone just isn't possible. There is no way to tell a computer to share the same feed among a group of computers. Each time someone requests the feed, a copy has to be made and sent to that subscriber individually. It may look like each user is getting the same information, but what is really happening is that they are each getting their own identical copy of the information.

 

Let's assume that a pull is on average a very small 1k, that a very small number of geocachers, say 100, sign up for the feed, and that on average they request an update every minute. Because the server doesn't have a way to deliver the same 1k feed to 100 users at once, it needs to make 100 copies of that 1k feed and then send one copy to each user. So that is 100k/min of bandwidth being used, or 6000k (6Meg) each hour, or 144Meg each day, or 4320Meg each month. Over 4Gig of bandwidth per month for only 100 users!
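Redoing that arithmetic explicitly (assuming round-the-clock polling; halve the totals if readers only poll for half the day):

```python
# Back-of-envelope bandwidth for a polled feed.
feed_kb = 1          # size of one feed response, in kB
users = 100
polls_per_hour = 60  # one request per user per minute

kb_per_hour = feed_kb * users * polls_per_hour
mb_per_day = kb_per_hour * 24 / 1000
mb_per_month = mb_per_day * 30
print(kb_per_hour, mb_per_day, mb_per_month)  # 6000 144.0 4320.0
```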

 

... but the equation isn't as simple as Push-is-less-than-Pull

You're right. The real equation is bang for the buck.

 

RSS feeds make sense on a news website where each user is likely to read the information retrieved. It doesn't make sense on a website like Geocaching.com because a majority of the new caches retrieved via the feed will not be within most users' caching range. Unless the feed can be personalized, it is worthless to the user and simply wastes bandwidth.

 

Another problem comes with personalizing feeds, and that is the CPU time it takes to personalize each feed for each user. One of the reasons for the limit on the number of Pocket Queries each user can create has to do with CPU usage. Nothing is perfect, but Pocket Queries do deliver the most useful information in the most timely manner for the resources that are available. In other words, Pocket Queries deliver the most bang for the buck.

Another problem comes with personalizing feeds and that is the CPU time it takes to personalize each feed for each user. One of the reasons for the limit on the number of Pocket Queries each user can create has to do with CPU usage.

 

That doesn't make sense. I can run at most 35 PQs per week; having 40, 400 or 4000 PQs lying dormant associated with my user account doesn't increase CPU usage; at worst it increases database storage by a few kilobytes.

Another problem comes with personalizing feeds and that is the CPU time it takes to personalize each feed for each user. One of the reasons for the limit on the number of Pocket Queries each user can create has to do with CPU usage.

 

That doesn't make sense. I can run at most 35 PQs per week; having 40, 400 or 4000 PQs lying dormant associated with my user account doesn't increase CPU usage; at worst it increases database storage by a few kilobytes.

 

You are correct. The amount of CPU usage to display an already generated PQ is very small compared to the CPU usage to generate the PQ. That is one of the reasons why there are limits on PQ generation and no limits on the number of times you can view the PQ.

 

I'm not too sure where you are going with this in relation to RSS.

Another problem comes with personalizing feeds and that is the CPU time it takes to personalize each feed for each user. One of the reasons for the limit on the number of Pocket Queries each user can create has to do with CPU usage.

 

That doesn't make sense. I can run at most 35 PQs per week; having 40, 400 or 4000 PQs lying dormant associated with my user account doesn't increase CPU usage; at worst it increases database storage by a few kilobytes.

 

You are correct. The amount of CPU usage to display an already generated PQ is very small compared to the CPU usage to generate the PQ. That is one of the reasons why there are limits on PQ generation and no limits on the number of times you can view the PQ.

 

You're still not making sense. I can modify my queries all I want; and since a modified query might be almost but not quite identical to the old one, or totally different from the original query, this must be exactly as computationally expensive as creating a new query (since you are, in effect, creating a new query).

Another problem comes with personalizing feeds and that is the CPU time it takes to personalize each feed for each user. One of the reasons for the limit on the number of Pocket Queries each user can create has to do with CPU usage.

 

That doesn't make sense. I can run at most 35 PQs per week; having 40, 400 or 4000 PQs lying dormant associated with my user account doesn't increase CPU usage; at worst it increases database storage by a few kilobytes.

 

You are correct. The amount of CPU usage to display an already generated PQ is very small compared to the CPU usage to generate the PQ. That is one of the reasons why there are limits on PQ generation and no limits on the number of times you can view the PQ.

 

You're still not making sense. I can modify my queries all I want; and since a modified query might be almost but not quite identical to the old one, or totally different from the original query, this must be exactly as computationally expensive as creating a new query (since you are, in effect, creating a new query).

 

You seem to be arguing just to argue. I agree with everything you say. Are you going to relate this to the topic of this thread, RSS?

This topic is now closed to further replies.