Rosie_Posie (+Premium Members, 47 posts)
Everything posted by Rosie_Posie

  1. If I can find a way to get this kind of data, I was definitely thinking of going this direction with part of my project. My problem is tracking down the data....... I can do it, I just need more hours in the day to be able to work on this!!! ahhhh! :-P
  2. This makes a lot of sense. So this would support the idea that roads, buildings, etc. will probably not make that much difference in terms of a maximum density of an area... I think what I am going to focus on is the randomness of how people hide them and do a simulation like someone posted earlier. Focus on the human factor more so than the physical obstacle factors.
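     Since the thread has floated the idea of simulating random hides, here is a minimal sketch of that kind of simulation: drop random points into a square area and keep only the ones that respect the 0.1-mile proximity guideline. The 1-square-mile test area, the attempt count, and the random seed are made-up parameters, not anything from my data.

        import math
        import random

        def simulate_random_hides(side_miles=1.0, min_spacing=0.1, attempts=20000, seed=42):
            """Randomly attempt cache placements in a side_miles x side_miles square,
            keeping only those at least min_spacing miles from every accepted cache."""
            random.seed(seed)
            placed = []
            for _ in range(attempts):
                x, y = random.uniform(0, side_miles), random.uniform(0, side_miles)
                if all(math.hypot(x - px, y - py) >= min_spacing for px, py in placed):
                    placed.append((x, y))
            return placed

        caches = simulate_random_hides()
        density = len(caches) / (1.0 * 1.0)  # caches per square mile for the 1 sq. mile test area
        print(f"randomly packed {len(caches)} caches (~{density:.0f} per sq. mile)")

     Random placement typically jams well below the hexagonal-packing ideal, which is exactly the human-factor gap I want to quantify.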
  3. I used counties for two reasons: 1) I found lots of awesome data for the number of geocaches in each county in the US, so it was easy to access everything, and 2) counties have nice distinct boundaries that I feel are easier to define. And I used 3 decimal places because when you get into the less populated areas, say Jamestown, North Dakota, the geocache density is only 0.079. My project is not ONLY looking at high saturation areas. It is looking at over 3,000 counties.
  4. Quote: Actually, that is NOT good. I would take it as an indication that the cache density is at most only indirectly related to the population density. Take a look outside the U.S. and Europe and you'll find that population density doesn't seem to be related at all. Bangladesh is considered the most densely populated country in the world. There are 3 caches in the entire country.
     I am not trying to create a model for all geocaches in the world, only in the US. There are too many differences between the various countries to do EVERYTHING.
     Also, I meant to mention earlier: there is a maximum of sorts on the population density that this equation will work for. When you think of large cities in the US with high population densities, it is because they are literally stacking people on top of each other in large apartment buildings and skyscrapers. You obviously cannot do that with geocaches. So in my paper I discuss the fact that this equation only works in locations with a population density below 3,000 or so per square mile. Beyond that it does not work as well. Again, this is certainly not a perfect model. I am simply trying to give some mathematical analysis to something that looks quite random at first glance...
  5. Quote: Actually, that is NOT good. I would take it as an indication that the cache density is at most only indirectly related to the population density. Using r^2 or significance values can be very tricky. Here is a hint that may help you find a better model: the population density in the LA area is roughly constant, but in some areas there is a high cache density and in others (e.g. Compton) it is very low. I would hypothesize that cache density is related to 3 primary factors: population density, park area, and average income.
     When I break it down by state, there are many that have an r^2 as high as 0.999. I am not saying this is going to be a perfect model, but it will work for what I am trying to accomplish. And I am certainly not claiming causation by any means, just that there is a relationship. Obviously there are more variables involved than just population density. I have an entire section of my report about other things to include in future research. But there is sooooo much data and soooo many variables; right now I am just focusing on one or two areas.
     Also, if my r^2 = 0.6382, then my r = 0.7989. From http://www.drtomoconnor.com/3760/3760lect07.htm:
     Quote: CORRELATION. The most commonly used relational statistic is correlation, and it's a measure of the strength of some relationship between two variables, not causality. Interpretation of a correlation coefficient does not even allow the slightest hint of causality. The most a researcher can say is that the variables share something in common; that is, are related in some way. The more two things have something in common, the more strongly they are related. There can also be negative relations, but the important quality of correlation coefficients is not their sign, but their absolute value. A correlation of -.58 is stronger than a correlation of .43, even though with the former, the relationship is negative. The following table lists the interpretations for various correlation coefficients:
        .8 to 1.0   very strong
        .6 to .8    strong
        .4 to .6    moderate
        .2 to .4    weak
        .0 to .2    very weak
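     Just to show the arithmetic behind that conversion, here is a quick sketch in plain Python (nothing from my actual spreadsheet) that turns an r^2 value into r and looks it up in the interpretation table quoted above.

        import math

        def correlation_strength(r_squared):
            """Convert a coefficient of determination (r^2) into |r| and label it
            using the interpretation table quoted above."""
            r = math.sqrt(r_squared)
            bands = [(0.8, "very strong"), (0.6, "strong"), (0.4, "moderate"),
                     (0.2, "weak"), (0.0, "very weak")]
            label = next(name for cutoff, name in bands if r >= cutoff)
            return r, label

        r, label = correlation_strength(0.6382)
        print(f"r^2 = 0.6382  ->  r = {r:.4f} ({label})")  # r = 0.7989, "strong"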
  6. Quote: Actually, that is NOT good. I would take it as an indication that the cache density is at most only indirectly related to the population density. Using r^2 or significance values can be very tricky. Here is a hint that may help you find a better model: the population density in the LA area is roughly constant, but in some areas there is a high cache density and in others (e.g. Compton) it is very low. I would hypothesize that cache density is related to 3 primary factors: population density, park area, and average income.
     When I break it down by state, there are many that have an r^2 as high as 0.999. I am not saying this is going to be a perfect model, but it will work for what I am trying to accomplish. And I am certainly not claiming causation by any means, just that there is a relationship. Obviously there are more variables involved than just population density. I have an entire section of my report about other things to include in future research. But there is sooooo much data and soooo many variables; right now I am just focusing on one or two areas.
  7. Quote: Actually, that is NOT good. I would take it as an indication that the cache density is at most only indirectly related to the population density. Take a look outside the U.S. and Europe and you'll find that population density doesn't seem to be related at all. Bangladesh is considered the most densely populated country in the world. There are 3 caches in the entire country.
     I am not trying to create a model for all geocaches in the world, only in the US. There are too many differences between the various countries to do EVERYTHING.
  8. Quote: I think this is where you're at. Even within a city, the variables will vary widely from block to block. Then you have variations from city to city, state to state, country to country, etc. They're called variables for a good reason! As for calling Seattle saturated, I think they're using a less literal definition. Rather than being mathematically saturated, where it's literally impossible to fit another cache due to the proximity guideline, I think they're referring to effective saturation. That's where all the "good" spots have been covered. You may be able to fit more caches, but they'd likely be lame (film canister behind a sign on the side of the road, LPC, etc.), illegal (cache on the side of an interstate, private property, etc.), or just ill-advised (hidden near a sensitive location).
     Sigh... OK, that's what I was starting to think. Just not very helpful for a mathematical approach. Oh well, I'm going to keep working on this, so if anyone has any suggestions, I'd love to hear them. The title of the project is simply "A Mathematical Analysis of Geocaching," so there are several directions I could take this... I have data for over 3,000 counties in the USA, including land area (sq. miles), population, geocaches, population density, and geocache density...
  9. My interest in geocache density was sparked when I was reading some discussion posts about Seattle being completely saturated. But when I looked at my data, it was nowhere close to a mathematical maximum density of 118 or so per sq. mile. Nothing has been close to that... So what makes geocachers call an area saturated? (Other than the few who say an area is saturated only because they are upset they cannot hide the cache they wanted to hide. lol) The conversation about Seattle seemed to imply that it truly had no more room for geocaches. Is that the case? How is that determined?
     I am trying to relate geocache density to population density in my project as a way to create a mathematical model for predicting the future geocache density of an area based on the future population predictions from the census. My data has shown an r^2 between the two variables of 0.6382, which I think is pretty good for this kind of data. That was why I was trying to pin down a true maximum density, so that I could say something like "Since this CITY will reach a population density of X, our model shows it will potentially have a geocache density of Y. Since Y is greater than some MAXIMUM, we can conclude that this CITY has the potential of being completely saturated with geocaches in the next # years."
     Is this something that I could potentially get to work, or am I just going to be going in circles with data that has too many variables to make any true conclusions/predictions?
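     Here is roughly the kind of model I have in mind, as a minimal sketch. The county numbers in it are made up for illustration (they are not from my real data set), and the 118-per-sq.-mile ceiling is just the ballpark figure from this thread; the idea is simply a least-squares fit of cache density against population density, followed by a prediction for a projected population density.

        import numpy as np

        # Made-up county data for illustration: population density and cache density,
        # both per square mile. (My real data set covers 3,000+ counties.)
        pop_density   = np.array([150.0, 400.0, 900.0, 1500.0, 2200.0, 2800.0])
        cache_density = np.array([0.6,   1.4,   2.9,   4.1,    5.5,    6.8])

        # Simple linear model: cache_density ~ a * pop_density + b, fit only on
        # counties below the ~3,000/sq. mile range where the relationship seems to hold.
        a, b = np.polyfit(pop_density, cache_density, 1)
        r = np.corrcoef(pop_density, cache_density)[0, 1]
        print(f"cache_density = {a:.5f} * pop_density + {b:.3f},  r^2 = {r**2:.4f}")

        # Predict cache density for a projected future population density and compare
        # it to an assumed saturation ceiling (the thread's ~118 caches per sq. mile).
        projected_pop_density = 2500.0
        predicted = a * projected_pop_density + b
        SATURATION_CEILING = 118.0
        verdict = "could saturate" if predicted >= SATURATION_CEILING else "below the ceiling"
        print(f"predicted cache density: {predicted:.2f} per sq. mile ({verdict})")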
  10. Yeah, I was thinking about this when I was looking at some of my counties with the highest density of geocaches. They are always the smallest counties in the list...
  11. I absolutely love this idea!!! Just the kind of thing I can do for this project! :-)
  12. Hello everyone! I have been working on a graduate project about geocaching and I was hoping to get your thoughts on some things. I want to know what the maximum geocache density would be in a "saturated" area. I have searched the forums for this and found the posts talking about "hexagonal packing," and those were very useful. But what I am hoping to figure out is the true maximum when you take into consideration all of the space taken up by buildings and roads, and also when you take into account that people do not hide their caches according to some hexagonal packing rule.
      For example, in my findings the COUNTY of New York has a geocache density of 8.6 geocaches per sq. mile. Is this high enough to consider the county saturated when you take into account all of the space taken up by streets, buildings, etc.? I found on the NY planning website a breakdown of land use. It showed what percentage of the county was taken up by buildings, streets, etc. Using only the remaining land area, the county had a geocache density more like 54 geocaches per sq. mile. My problem is I do not have that sort of information for all of my data. Sooooo I need to find a number that represents a true maximum density. Below is some of the data that I have, just to give you an idea...

      State | County | # of Geocaches | Area (sq. miles) | Geocache Density
      Virginia | Fredericksburg City | 158 | 11 | 15.019
      Virginia | Falls Church City | 21 | 2 | 10.500
      Nevada | Clark | 6,872 | 737.67 | 9.316
      New York | New York | 290 | 33.68 | 8.610
      Virginia | Alexandria City | 129 | 15 | 8.388
      Minnesota | Ramsey | 1,399 | 170.16 | 8.222
      Colorado | Broomfield | 269 | 33 | 8.152
      Virginia | Arlington | 197 | 26 | 7.586
      Illinois | Du Page | 2,381 | 336.87 | 7.068
      Oregon | Multnomah | 3,060 | 465.7 | 6.571
      Colorado | Denver | 949 | 154.88 | 6.127
      California | Orange | 5,567 | 947.91 | 5.873
      Minnesota | Hennepin | 3,544 | 606.43 | 5.844
      Kentucky | Jefferson | 2,305 | 398.6 | 5.783
      New Jersey | Union | 497 | 105.46 | 4.713
      Alabama | Baldwin | 9,335 | 2027.08 | 4.605
      Indiana | Marion | 1,839 | 403.11 | 4.562
      Texas | Tarrant | 4,038 | 897.56 | 4.499
      California | Contra Costa | 3,591 | 802.18 | 4.477
      Virginia | Danville City | 194 | 44 | 4.415
      Illinois | Cook | 7,121 | 1634.89 | 4.356
      New Jersey | Passaic | 841 | 197.07 | 4.268
      Ohio | Hamilton | 1,752 | 412.81 | 4.244
      Michigan | Macomb | 2,349 | 569.81 | 4.122
      Utah | Salt Lake | 3,318 | 807.83 | 4.107
      Pennsylvania | Philadelphia | 572 | 142.68 | 4.009
      Minnesota | Anoka | 1,748 | 446.28 | 3.917
      California | Alameda | 3,156 | 821.26 | 3.843
      Virginia | Lynchburg City | 191 | 50 | 3.838
      Wisconsin | Washington | 1,648 | 435.92 | 3.781
      Texas | Dallas | 3,429 | 908.87 | 3.773
      Michigan | Wayne | 2,516 | 672.26 | 3.743
      Oregon | Washington | 2,641 | 726.43 | 3.636
      California | Santa Clara | 4,704 | 1304.53 | 3.606
      Rhode Island | Bristol | 159 | 44.71 | 3.556
      New Jersey | Essex | 460 | 129.59 | 3.550
      New Jersey | Morris | 1,707 | 481.36 | 3.546
      California | Sacramento | 3,512 | 995.65 | 3.527
      Texas | Harris | 6,246 | 1777.89 | 3.513
      Michigan | Oakland | 3,173 | 908.07 | 3.494
      Ohio | Montgomery | 1,595 | 464.39 | 3.435
      Virginia | Richmond City | 215 | 63 | 3.434
      New Jersey | Camden | 778 | 227.59 | 3.418
      Nebraska | Douglas | 1,159 | 339.65 | 3.412
      California | Ventura | 7,514 | 2208.36 | 3.403
      Rhode Island | Kent | 626 | 188.01 | 3.330
      California | San Diego | 14,766 | 4525.92 | 3.263
      North Carolina | Wake | 2,788 | 857.5 | 3.251
      Texas | Collin | 2,872 | 885.91 | 3.242
      Pennsylvania | Allegheny | 2,414 | 744.73 | 3.241
      Ohio | Franklin | 1,750 | 543.35 | 3.221
      New Mexico | Bernalillo | 3,661 | 1168.77 | 3.132
      Colorado | Jefferson | 2,415 | 778.2 | 3.103
      Virginia | Fairfax | 1,250 | 407 | 3.073
      Delaware | New Castle | 1,513 | 493.53 | 3.066
      Texas | Denton | 2,772 | 910.55 | 3.044
      Iowa | Polk | 1,787 | 591.94 | 3.019
      Pennsylvania | Montgomery | 1,423 | 487.47 | 2.919
      Minnesota | Washington | 1,228 | 423.19 | 2.902
      North Carolina | Guilford | 1,902 | 657.73 | 2.892
      Pennsylvania | Bucks | 1,774 | 622.15 | 2.851
      Georgia | Cobb | 974 | 344.54 | 2.827
      Indiana | Vanderburgh | 666 | 235.76 | 2.825
      California | San Francisco | 655 | 231.89 | 2.825
      Tennessee | Sullivan | 1,204 | 429.71 | 2.802
      District of Columbia | | 191 | 68.36 | 2.794
      California | San Mateo | 2,065 | 741.07 | 2.787
      Oklahoma | Oklahoma | 1,995 | 718.38 | 2.777
      Illinois | Kane | 1,431 | 524.12 | 2.730
      California | Los Angeles | 12,846 | 4752.32 | 2.703
      Georgia | Fayette | 538 | 199.27 | 2.700
      Washington | Clark | 1,743 | 656.26 | 2.656
      Ohio | Summit | 1,086 | 420.1 | 2.585
      Colorado | Arapahoe | 2,080 | 805.47 | 2.582
      New Hampshire | Rockingham | 2,011 | 794.04 | 2.533
      New Jersey | Burlington | 2,062 | 819.48 | 2.516
      Kentucky | Franklin | 531 | 212.13 | 2.503
      Minnesota | Dakota | 1,463 | 586.36 | 2.495
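      For the "true maximum" question, here is a minimal back-of-the-envelope sketch of the geometric ceiling. With the 0.1-mile proximity guideline, ideal hexagonal packing works out to roughly 115 caches per sq. mile (the same ballpark as the 118-or-so figure that gets quoted), and you can then scale that by whatever fraction of the land is actually usable. The ~16% usable-land fraction in the example is only what my New York County numbers imply (8.6 vs. ~54 per sq. mile), not something taken from the planning website.

         import math

         def hex_packing_max(min_spacing_miles=0.1):
             """Theoretical ceiling on caches per square mile if every cache sat on an
             ideal hexagonal grid with the given minimum spacing (the 0.1-mile guideline)."""
             # Each point in a hexagonal lattice "owns" an area of (sqrt(3)/2) * d^2.
             return 2.0 / (math.sqrt(3) * min_spacing_miles ** 2)

         ceiling = hex_packing_max()            # ~115.5 caches per sq. mile
         usable_fraction = 8.610 / 54.0         # ~0.16, implied by the NY County numbers above
         effective_ceiling = ceiling * usable_fraction

         print(f"ideal hexagonal ceiling: {ceiling:.1f} caches per sq. mile")
         print(f"with ~{usable_fraction:.0%} usable land: {effective_ceiling:.1f} caches per sq. mile")

      And of course nobody hides on a perfect hexagonal grid, so the real "effective" ceiling would be lower still.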
  13. AHA! I got it to work. Sooo the first time I did this query I messed up the coordinates. I fixed them but then all this time I was trying to get it to resend an old query with new coordinates. Apparently it won't do that and so I just created a brand new query and it worked. So we're all good! Thanks for your help!!
  14. Quote: It was only like 300 and some because I was looking at a specific area trying to test it, but I never got an e-mail for it. So I was just looking at the list of geocaches that it gave me when I clicked "preview." I might have to try it again to figure out why it didn't go to my e-mail...
      Quote: The preview window is great to use to look at it on the map, but the way to download is a screen back from there.
      Quote: Ok, so after I submitted the query, it says "Thanks! Your pocket query has been modified and currently results in 381 caches. You can preview the search on the nearest cache page." and that is where I was clicking "preview the search" and it brings me to a page that says "Running Pocket Query" with the list of all the geocaches from my query. At the bottom of that page was where I was clicking "Check all" and "Download waypoints." I am not seeing anything on the first page to let me download. And I just tried a different e-mail and the query still isn't showing up...hmmmm
      Quote: Go back to the main screen, your profile, then Pocket Queries. There should be Active Pocket Queries, and Pocket Queries ready for Download on little tabs. Pick the Pocket Queries ready for Download then click on the query that you want to work with, save it to a location you can find again. Open GSAK, on the upper left there should be a little file folder, click on that, browse and navigate to the PQ file you just saved.
      Ok, so the query is listed under Active Pocket Queries, but it says No Downloads Available under the Pocket Queries Ready for Download.
  15. Quote: Are you sure you're loading a .GPX file into GSAK? You can't use .LOC files as they don't contain the info you need.
      Quote: Hmmm that is a good question. The pocket query was not going through to my e-mail correctly but it came up in a separate window so I just "checked all" and clicked "download waypoints." It doesn't give me the option of .GPX or .LOC there. So maybe that is why it is not working...
      Quote: If you went with 1,000 wpts it doesn't go through your email. You have to go back to the PQ window and download from there.
      Quote: It was only like 300 and some because I was looking at a specific area trying to test it, but I never got an e-mail for it. So I was just looking at the list of geocaches that it gave me when I clicked "preview." I might have to try it again to figure out why it didn't go to my e-mail...
      Quote: The preview window is great to use to look at it on the map, but the way to download is a screen back from there.
      Ok, so after I submitted the query, it says "Thanks! Your pocket query has been modified and currently results in 381 caches. You can preview the search on the nearest cache page." and that is where I was clicking "preview the search" and it brings me to a page that says "Running Pocket Query" with the list of all the geocaches from my query. At the bottom of that page was where I was clicking "Check all" and "Download waypoints." I am not seeing anything on the first page to let me download. And I just tried a different e-mail and the query still isn't showing up...hmmmm
  16. Quote: Are you sure you're loading a .GPX file into GSAK? You can't use .LOC files as they don't contain the info you need.
      Quote: Hmmm that is a good question. The pocket query was not going through to my e-mail correctly but it came up in a separate window so I just "checked all" and clicked "download waypoints." It doesn't give me the option of .GPX or .LOC there. So maybe that is why it is not working...
      Quote: If you went with 1,000 wpts it doesn't go through your email. You have to go back to the PQ window and download from there.
      It was only like 300 and some because I was looking at a specific area trying to test it, but I never got an e-mail for it. So I was just looking at the list of geocaches that it gave me when I clicked "preview." I might have to try it again to figure out why it didn't go to my e-mail...
  17. Quote: So one thing that you've touched on, but you need to clarify in your notes, is that the number of caches fluctuates. You'll need to have some snapshot dates, probably at regular intervals. The lovely thing about PQs is once you set them up you can get them to run once a week, which is a nice logical interval.
      Yes, I was thinking about that. Maybe coming up with a growth rate and a "death" rate for when caches are archived. And this would be different for different areas, I think. Like maybe urban areas have a faster "death" rate because of more muggles?
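      To make that concrete, here is a minimal sketch of the kind of growth/archive model I am picturing. The placement and archive rates are invented numbers just to show the shape of it; the real rates would come out of weekly PQ snapshots.

         def project_cache_count(current, placements_per_week, archive_rate, weeks):
             """Project the number of active caches forward, assuming a constant number
             of new hides per week and a constant fraction archived ("dying") per week."""
             counts = [current]
             for _ in range(weeks):
                 current = current + placements_per_week - archive_rate * current
                 counts.append(current)
             return counts

         # Invented rates: 3 new hides per week, 0.5% of active caches archived per week,
         # starting from New York County's current count of 290.
         projection = project_cache_count(current=290, placements_per_week=3,
                                          archive_rate=0.005, weeks=52)
         print(f"after a year: ~{projection[-1]:.0f} active caches "
               f"(long-run equilibrium ~{3 / 0.005:.0f})")

      An urban area with more muggles would just get a bigger archive_rate, which drags the equilibrium (placements divided by archive rate) down.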
  18. Quote: Are you sure you're loading a .GPX file into GSAK? You can't use .LOC files as they don't contain the info you need.
      Hmmm that is a good question. The pocket query was not going through to my e-mail correctly but it came up in a separate window so I just "checked all" and clicked "download waypoints." It doesn't give me the option of .GPX or .LOC there. So maybe that is why it is not working...
  19. Quote: On mine the last GPX column is right next to the placed date, maybe hide the last GPX column. Also GSAK comes with sample data and a tutorial if I remember, you might want to take a bit of time and fiddle with it.
      Yeah, mine are right next to each other too and they both say 4/23/14 for all of the caches I downloaded. Not sure why though, because it only happened when I downloaded multiple at a time...
  20. I am getting my Masters in Mathematics, so really I could have done a mathematical analysis of just about anything. I just enjoy geocaching and thought it would be a good idea to spend the next several months of my life working on a topic I enjoy. I am still early in the process here, but what I want to do is develop a mathematical model to represent the number of geocaches in an area. I need data for how many caches are in all these different areas so that I can get some numbers like:
      - average number of caches in an area with high population density
      - average number of caches in an area with low population density
      - rate at which caches are being planted in the different areas (high density areas, low density areas)
      - how many geocaches a certain area can hold while following all of the rules perfectly (i.e. caches exactly 0.1 miles apart)
      - how many geocaches a TYPICAL area holds, since people do not plant them the perfect distance apart, and there are buildings in the way, etc.
      I want to create a model where, when I put in information for certain variables (maybe population density or something, not sure yet), it would tell me how many caches are typically in the area and/or whether the area will be saturated after a certain amount of time. I don't know any of this for sure. I am still bouncing ideas around...
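      Here is a minimal sketch of the first couple of bullets, assuming the county data lives in a CSV with hypothetical columns named state, county, population_density, and cache_density; the 1,000-per-sq.-mile cutoff for "high density" is also just a placeholder.

         import pandas as pd

         # Hypothetical file and column names; my real data is a spreadsheet of 3,000+ counties.
         counties = pd.read_csv("county_cache_data.csv")

         # Split counties into high- vs. low-population-density groups (placeholder cutoff).
         cutoff = 1000  # people per square mile
         counties["density_class"] = counties["population_density"].apply(
             lambda d: "high" if d >= cutoff else "low")

         # Average cache density (caches per square mile) within each group.
         summary = counties.groupby("density_class")["cache_density"].agg(["mean", "median", "count"])
         print(summary)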
  21. So I am trying to get GSAK to do what I am looking for, and I have gotten it to take a pocket query and load it into GSAK, but all of the caches have a placed date of 4/23/14, and the spots for logs and last-found dates are all blank. BUT when I load a single geocache into GSAK, all of the information comes up correctly. What am I doing wrong?
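      One way to sanity-check the dates outside of GSAK is to open the pocket query's .gpx file directly; the placed date should be sitting in each waypoint's <time> element. This is just a rough sketch using Python's built-in XML parser. The filename is a placeholder, and I am assuming the file uses the GPX 1.0 namespace that pocket queries have historically used.

         import xml.etree.ElementTree as ET

         GPX_NS = "{http://www.topografix.com/GPX/1/0}"  # GPX 1.0 namespace (adjust if the file is GPX 1.1)

         # Placeholder filename for the downloaded pocket query.
         root = ET.parse("my_pocket_query.gpx").getroot()

         for wpt in root.findall(f"{GPX_NS}wpt"):
             code = wpt.findtext(f"{GPX_NS}name")    # the GC code
             placed = wpt.findtext(f"{GPX_NS}time")  # placed date in ISO format
             print(code, placed)

      If the dates look right here but wrong in GSAK, the problem is on the GSAK side rather than in the query itself.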
  22. Thanks! I was just looking at GSAK and it is definitely a little overwhelming but I think I might be able to make it work for me...
  23. Also, I am willing to pay for the premium membership if it will help me with this task. I'm just not sure if the premium membership does something I can't do now that would make it easier. If someone with a premium membership could post and let me know whether they think it would help. :-) Can a premium member download lists into Excel?
  24. Hmm, fiddling around with the site.... Maybe there is a way to do it per city and within a certain radius. Say "All geocaches within a 25 mile radius of Seattle" Still not sure the best way to get that list from the website to my excel sheet though....