
Posts

Showing posts from 2011

Planetary states API

Update: This API is deprecated. Use the new JSON API instead.

I needed a way to deal with planetary positions and velocities and found NASA's HORIZONS and the ephemerides. But I wanted a simpler interface than telnet or lugging around the massive ephemeris files with my applications. So instead, I wrote a simple JSON API for dealing with ephemeris files.

Suppose one wanted the Chebyshev coefficients for computing Mercury's state for today's date (November 5th). The URL query would look like this:

    http://www.astro-phys.com/api/coeffs?date=2011-11-5&bodies=mercury

This returns a JSON object whose structure looks like this:

    {
      "date": 2455870.5,
      "results": {
        "mercury": {
          "coeffs": ...,
          "start": 2455856.5,
          "end": 2455872.5
        }
      }
    }

Here "coeffs" contains the Chebyshev coefficients for evaluating the state of Mercury between the Julian dates 2455856.5 and 2455872.5.
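To turn those coefficients into a state, one evaluates the Chebyshev series at the desired date after mapping the date from [start, end] onto [-1, 1]. A minimal sketch (the coefficient values below are made up; real ones would come from the "coeffs" field of the response):

```python
def cheb_eval(coeffs, jd, start, end):
    """Evaluate a Chebyshev series at Julian date jd, where the
    coefficients are valid over the interval [start, end]."""
    # Map jd from [start, end] onto the Chebyshev domain [-1, 1]
    t = (2.0 * jd - start - end) / (end - start)
    # Clenshaw recurrence: a numerically stable way to sum the series
    b1, b2 = 0.0, 0.0
    for c in reversed(coeffs[1:]):
        b1, b2 = c + 2.0 * t * b1 - b2, b1
    return coeffs[0] + t * b1 - b2

# Toy coefficients for one coordinate of a state vector
coeffs = [57.9, 1.2, -0.3]
x = cheb_eval(coeffs, 2455870.5, 2455856.5, 2455872.5)
```

Ephemerides of this style store one such series per coordinate per time block, so recovering a full position is three evaluations (and velocities come from the series' derivative).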

Davy's law

Davy's Law: computers can't compute/predict themselves. No physical computer is capable of losslessly determining its effect on every state in its state space by means of internal simulation. That is, the only way it can accurately achieve the result of a state is by actually being put into that state. The limiting factor here is the necessity of storing the entirety of its state AND its rule set within its allotted state (with room to spare for performing computation). This would be in violation of the pigeonhole principle's effect on lossless compression. Not to mention that if it were possible, it could simulate itself simulating itself simulating itself... And unless it can solve the halting problem, that's probably just not a good feature for a system to have. Davy's law does not prohibit computers from computing with isolated portions of their state. It also does not state that there aren't some global state computations that are possible. Th…

Naive Bayes (and author detection)

I've been playing around with various classification algorithms lately, so I wrote a really simplified discrete naive Bayes classifier in Python. There's no emphasis on sample correction; simplicity was key here, but it still works quite well.

    from operator import itemgetter
    from collections import defaultdict

    class BayesClassifier:
        def __init__(self):
            self.total_count = 0                 # Observations of individual attributes
            self.class_count = defaultdict(int)  # Observations of cls
            self.attrs_count = defaultdict(int)  # Observations of (cls, attr)
            self.correction = 0.0001             # Prevent multiplication by 0.0

        def train(self, cls, attrs):
            ''' Add observation of 'attrs' as being an instance of 'cls' '''
            self.class_count[cls] += 1
            for attr in attrs:
                self.attrs_count[(cls, attr)] += 1
                self.total_count += 1

        def rate(self, cls, attrs):
            ''' Return probability rating of 'attrs' being an instance of 'cls' '''
            # The excerpt cuts off here; what follows is a plausible completion.
            rating = self.class_count[cls] / float(sum(self.class_count.values()))
            for attr in attrs:
                p = self.attrs_count[(cls, attr)] / float(self.class_count[cls])
                rating *= p if p > 0 else self.correction
            return rating

        def classify(self, attrs):
            ''' Return (cls, rating) for the highest-rated class '''
            ratings = [(cls, self.rate(cls, attrs)) for cls in self.class_count]
            return max(ratings, key=itemgetter(1))
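For the author-detection angle, the same idea can be sketched end to end: treat each author as a class and each word as an attribute, then pick the author whose word statistics best match an unseen sentence. This is a self-contained toy, not the classifier above; the author names and training sentences are invented for illustration:

```python
from collections import defaultdict
from math import log

def train(samples):
    """samples: iterable of (author, text) pairs."""
    class_count = defaultdict(int)   # documents seen per author
    word_count = defaultdict(int)    # (author, word) observations
    for author, text in samples:
        class_count[author] += 1
        for word in text.lower().split():
            word_count[(author, word)] += 1
    return class_count, word_count

def classify(class_count, word_count, text, correction=0.0001):
    """Return the author whose word statistics best match 'text'."""
    total = float(sum(class_count.values()))
    best, best_score = None, float("-inf")
    for author, count in class_count.items():
        # Sum log-ratings instead of multiplying raw probabilities,
        # which avoids floating-point underflow on long documents.
        score = log(count / total)
        for word in text.lower().split():
            p = word_count[(author, word)] / float(count)
            score += log(p if p > 0 else correction)
        if score > best_score:
            best, best_score = author, score
    return best

samples = [
    ("alice", "the whale surfaced near the ship"),
    ("alice", "the captain watched the whale"),
    ("bob", "tea and cakes were served at noon"),
    ("bob", "she poured the tea slowly"),
]
class_count, word_count = train(samples)
print(classify(class_count, word_count, "the whale breached"))  # prints: alice
```

The `correction` constant plays the same role as in the classifier above: an unseen (author, word) pair contributes a small penalty rather than zeroing out the whole score.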