I am starting a web app in Django that must provide one simple task: get all records from the DB that are close enough to another record.
For example: I am at lat/lng (50, 10), and I need to get all records with a lat/lng within 5 km of me.
I found the Django extension called GeoDjango, but it comes with a lot of other dependencies and libraries like GEOS, PostGIS, and other things I don't really need. I only need this one range feature.
So should I use GeoDjango, or just write my own range calculation query?
Most definitely do not write your own. As you get more familiar with geographic data you will realize that this particular calculation isn't at all simple; see, for example, this question for a detailed discussion. Note that most of the solutions (answers) given in that question produce only approximate results, partly because the earth is not a perfect sphere.
On the other hand, if you use the geospatial extensions for MySQL (5.7 onwards) or PostgreSQL, you can make use of the ST_DWithin function:
ST_DWithin: Returns true if the geometries are within the specified distance of one another. For geometry, units are those of the spatial reference; for geography, units are in meters and measurement defaults to use_spheroid=true (measure around the spheroid). For a faster check, use use_spheroid=false to measure along the sphere.
ST_DWithin makes use of spatial indexes, which home-made solutions cannot. When GeoDjango is enabled, ST_DWithin becomes available on Django querysets as the dwithin filter.
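For illustration, here is a minimal sketch assuming a hypothetical Place model with a GeoDjango PointField named location, backed by PostGIS:

```python
# A minimal sketch: a hypothetical model plus a 5 km dwithin query.
from django.contrib.gis.db import models
from django.contrib.gis.geos import Point
from django.contrib.gis.measure import D  # "D" is shorthand for Distance

class Place(models.Model):
    name = models.CharField(max_length=100)
    location = models.PointField(geography=True)  # meter-based distances

# Everything within 5 km of lat/lng (50, 10); note Point takes (lng, lat).
here = Point(10, 50, srid=4326)
nearby = Place.objects.filter(location__dwithin=(here, D(km=5)))
```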
Last but not least, if you write your own code, you will have to write a lot of code to test it too, whereas dwithin is already thoroughly tested.
For context, I am working on a project that involves controlling a cursor with eye-tracking data. To achieve this, I am imitating the techniques described in the paper below:
https://iopscience.iop.org/article/10.1088/1741-2560/11/5/056026
So far I have successfully implemented the Kalman filter calibration process described in sections 2.2.1 and 2.2.2. The focus of my question is section 2.2.3, which briefly describes an additional filter modeled after a mass-spring-damper system.
I would like to implement this filter for my project. However, the paper does not provide any additional details or references, and frankly I'm not sure where to start. I have found some resources for simulating mass-spring-damper systems in Python, particularly the document below:
https://www.halvorsen.blog/documents/programming/python/resources/powerpoints/Mass-Spring-Damper%20System%20with%20Python.pdf
My problem with the examples in this document is that they each take a description of the system and an array of time steps as inputs and return the state of the system at those time steps as outputs. I need something that behaves like a filter: it takes a single state as input and returns a filtered state as output. I'm not sure how to adapt the examples in this document to behave this way, or whether it even makes sense to try; the sketch below shows the kind of stateful adaptation I have in mind.
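For illustration, here is my own sketch (with made-up constants, not taken from the paper) of a mass-spring-damper integrated one step per incoming sample:

```python
# A mass-spring-damper treated as a stateful filter. Each call to
# update() advances the simulation by one time step dt, with the new
# measurement acting as the point the spring is anchored to.
class MassSpringDamperFilter:
    def __init__(self, mass=1.0, damping=8.0, stiffness=25.0, dt=1/60.0):
        self.m, self.c, self.k, self.dt = mass, damping, stiffness, dt
        self.x = 0.0   # filtered position (e.g. cursor coordinate)
        self.v = 0.0   # velocity

    def update(self, target):
        # The spring pulls the cursor toward the raw gaze point `target`;
        # damping opposes velocity. Semi-implicit Euler integration.
        a = (self.k * (target - self.x) - self.c * self.v) / self.m
        self.v += a * self.dt
        self.x += self.v * self.dt
        return self.x

# Usage: feed one raw gaze sample per frame.
f = MassSpringDamperFilter()
for raw in [0.0, 1.0, 1.0, 1.0, 1.0]:
    print(f.update(raw))
```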
Could someone please point me in the right direction? Links to resources, useful python library recommendations, or example code would be much appreciated.
I am working in Python 2.7. I want to create nomograms based on data for various variables in order to predict one variable. I have looked into and installed the PyNomo package.
However, from the documentation (here and here) and the examples, it seems that nomograms can only be made when you have equation(s) relating the variables, not from data alone. For example, the examples here show how to use equations to create nomograms. What I want is to create a nomogram from the data and use that to predict things. How do I do that? In other words, how do I make the nomograph take data as input rather than a function? Is it even possible?
Any input would be helpful. If PyNomo cannot do it, please suggest some other package (in any language). For example, I am trying the nomogram function from the rms package in R, but I am not having luck figuring out how to use it properly. I have asked a separate question about that here.
The term "nomogram" has become somewhat confused of late as it now refers to two entirely different things.
A classic nomogram performs a full calculation: you mark two scales, draw a straight line across the marks, and read your answer from a third scale. This is the type of nomogram that PyNomo produces, and as you correctly say, you need a formula. Producing a nomogram from data is therefore a two-step process: first fit a formula to the data, then draft a nomogram of that formula, as sketched below.
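Here is a minimal sketch of the first step, assuming two predictors and a linear relationship (the variable names and toy data are my own); the resulting formula could then be handed to PyNomo:

```python
# Fit y = a*x1 + b*x2 + c to data by ordinary least squares.
import numpy as np

np.random.seed(0)                         # toy data for illustration
x1 = np.random.uniform(0, 10, 100)
x2 = np.random.uniform(0, 5, 100)
y = 2.0 * x1 - 3.0 * x2 + 1.0 + np.random.normal(0, 0.1, 100)

A = np.column_stack([x1, x2, np.ones_like(x1)])
coef, residuals, rank, sv = np.linalg.lstsq(A, y, rcond=None)
print("y = %.3f*x1 + %.3f*x2 + %.3f" % tuple(coef))
```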
The other use of the term (very popular recently) refers to regression nomograms. These are graphical depictions of regression models (usually logistic regression models). A group of predictor variables is drawn on parallel axes with a common scale at the bottom; for each predictor you read the 'score' from the scale and add the scores up. These types of nomograms have become very popular in the last few years, and that's what the rms package will draft. I haven't used it, but my understanding is that it works directly from the data.
Hope this is of some use! :-)
I've already posted this on robotics.stackexchange, but I got no relevant answer.
I'm currently developing SLAM software on a robot, and I tried a scan-matching algorithm to solve the odometry problem.
I read this article: "Metric-Based Iterative Closest Point Scan Matching for Sensor Displacement Estimation".
I found it really well explained, and I strictly followed the formulas given in the article to implement the algorithm.
You can see my Python implementation here: ScanMatching.py
The problem I have is that, during my tests, the rotation was estimated correctly, but the translation was totally wrong: the translation values were extremely high.
Do you guys have any idea what the problem in my code could be?
Otherwise, should I post my question on Mathematics Stack Exchange?
The ICP part should be correct, as I've tested it many times, but the least-squares minimization doesn't seem to give good results.
As you noticed, I used many bigfloat.BigFloat values, because sometimes intermediate values were too big for the standard float type.
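For reference, the standard closed-form least-squares solution for a 2D rigid transform looks like this (a NumPy/SVD sketch of the classic Kabsch approach, not the article's metric-based formulation), which could serve as a sanity check:

```python
import numpy as np

def rigid_transform_2d(P, Q):
    """Find R (2x2) and t (2,) minimizing sum ||R @ P[i] + t - Q[i]||^2.
    P, Q: (N, 2) arrays of matched points."""
    p_mean, q_mean = P.mean(axis=0), Q.mean(axis=0)
    H = (P - p_mean).T @ (Q - q_mean)          # cross-covariance matrix
    U, S, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))     # guard against reflection
    R = Vt.T @ np.diag([1.0, d]) @ U.T
    t = q_mean - R @ p_mean
    return R, t
```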
I don't know if you've already solved this issue.
I didn't read the full article, but I noticed it is rather old.
IMHO (I'm not an expert here), I would try chaining specific algorithms: feature detection and description to get a point cloud, a descriptor matcher to relate points, and bundle adjustment to get the roto-translation matrix.
I am myself going to try sba (http://users.ics.forth.gr/~lourakis/sba/), or more specifically cvsba (http://www.uco.es/investiga/grupos/ava/node/39/), because I'm using OpenCV.
If you have enough CPU/GPU power, give the AKAZE feature detector and descriptor a chance.
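For instance, a minimal sketch of AKAZE detection and descriptor matching with OpenCV (the file names are placeholders):

```python
import cv2

img1 = cv2.imread("frame1.png", cv2.IMREAD_GRAYSCALE)
img2 = cv2.imread("frame2.png", cv2.IMREAD_GRAYSCALE)

# Detect keypoints and compute descriptors in both frames.
akaze = cv2.AKAZE_create()
kp1, des1 = akaze.detectAndCompute(img1, None)
kp2, des2 = akaze.detectAndCompute(img2, None)

# AKAZE produces binary descriptors, so Hamming distance is appropriate.
matcher = cv2.BFMatcher(cv2.NORM_HAMMING, crossCheck=True)
matches = sorted(matcher.match(des1, des2), key=lambda m: m.distance)
print("found %d matches" % len(matches))
```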
For my app, I need to determine the points nearest to some other point, and I am looking for a simple but relatively fast (in terms of performance) solution. I was thinking about using PostGIS and GeoDjango, but I think my app is not really that "geographic" (I still don't really know what that means, though). The geographic part (around 5 percent of the whole app) is that I need to store coordinates of objects (people and places) and then find the points nearest to a given one. To put it simply, PostGIS and GeoDjango seem like overkill here.
I was also thinking of django-haystack with Solr or Elasticsearch, because I am going to need very strong text-search capabilities, and these engines also offer such "geographic" features. But I'm not sure about that either, as I am wary of keeping the core DB and the search-engine DB in sync, and of the hardware requirements of these engines. At the moment I am more inclined to use PostgreSQL trigrams and some custom way to solve the "find the nearest points" problem. Is there a good one?
To find points or bounding boxes that are near each other, consider the Rtree Python package. It uses a spatial-index technique similar to PostGIS, except it is not database software and can be embedded directly in your application. In my tests it found nearby points faster than PostGIS on millions of objects.
See the examples in the tutorial to get a good feel for finding the nearest objects; a minimal usage sketch follows.
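For instance, with toy coordinates:

```python
# Index some points as zero-area boxes, then query the nearest ones.
from rtree import index

idx = index.Index()
points = [(0, (10.0, 50.0)), (1, (10.1, 50.1)), (2, (11.0, 51.0))]
for pid, (x, y) in points:
    idx.insert(pid, (x, y, x, y))        # (minx, miny, maxx, maxy)

# IDs of the 2 points nearest to (10.05, 50.05)
print(list(idx.nearest((10.05, 50.05, 10.05, 50.05), 2)))
```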
You're probably right that PostGIS/GeoDjango is overkill, but making your own Django app would not be too much trouble for your simple task. Django offers a lot in terms of templating, etc., and the built-in admin makes it pretty easy to enter single records. And GeoDjango is part of contrib, so you can always add it later if your project needs it.
Check out Shapely. It looks like the geometry object's project() method may be what you're looking for.
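For illustration, a small sketch of Shapely's linear-referencing methods (toy data, my own example):

```python
# project() returns the distance along a line to the point nearest
# another geometry, and interpolate() recovers that nearest point.
from shapely.geometry import LineString, Point

line = LineString([(0, 0), (10, 0)])
p = Point(4, 3)
d = line.project(p)          # distance along the line: 4.0
print(line.interpolate(d))   # POINT (4 0), the closest point on the line
```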
I have been going through the options on Google App Engine, trying to find the best way to search a model by GPS coordinates. There seem to be a few decent options out there, such as GeoModel (which is out of date), but the new Search API seems to be the most current and provides a good bit of functionality. This of course comes with the possibility of it getting expensive after 1000 searches once it leaves the experimental stage.
I am having trouble working through the docs to put together a coherent, full example of using the Search API to search by location, and I want to know whether anyone has examples they are willing to share, or could create, to make this process a little more straightforward. I understand how to create the actual geosearch query, but I am unclear on how to glue that together with the construction and indexing of the document.
I don't know of any examples that show geo specifically, but the process is very much the same. You can index documents containing GeoFields, each of which has a latitude and a longitude. Then, when you construct your query, you can:
limit the results by distance from a fixed point by using a query like distance(my_geo_field, geopoint(41, 65)) < 100
sort by distance from a point with a sort expression like distance(my_geo_field, geopoint(55, -20))
calculate expressions based on the distance between points by using a FieldExpression like distance(my_geo_field, geopoint(10, -30))
They work pretty much like any other field, except that you can use the distance function in the query and expression languages. The sketch below glues document construction, indexing, and a distance query together. If you have any specific questions, feel free to ask here.
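A sketch, with index, field, and document names of my own invention:

```python
from google.appengine.api import search

index = search.Index(name='places')

# Construction and indexing of a document containing a GeoField.
doc = search.Document(
    doc_id='place-1',
    fields=[
        search.TextField(name='name', value='Coffee shop'),
        search.GeoField(name='my_geo_field',
                        value=search.GeoPoint(50.0, 10.0)),  # (lat, lng)
    ])
index.put(doc)

# Query: everything within 5000 meters of (50, 10).
results = index.search('distance(my_geo_field, geopoint(50, 10)) < 5000')
for d in results:
    print(d.doc_id)
```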