hcarter333 / rm-rbn-history
Mapping reverse beacon spots and QSOs of the Rockmite 20 at KD0FNR
Related to #13
Use kml timestamp definitions to animate maps on Google Earth using kml. Reference material on kml timestamps:
https://developers.google.com/kml/documentation/time#animating
The file needs to be tested. That also means getting the current 'scriptiness' out of it.
Modify the actions for this project to utilize auto_geo_update to automatically create kml maps. The advantage of this will be that the mapping data will be revision controlled here. Also, revision control the output maps in a folder named maps.
Create a python script that adds new QSO information (including, especially, the station's geographic information) to rm_rnb_history_pres.csv. The script will require the following input:
Command line:
The program will then write QSOs to rm_rnb_history_pres.csv in the standard format:
id | tx_lng | tx_lat | rx_lng | rx_lat | timestamp | dB | frequency | Spotter
Where the dB field will be the rx RST.
rm_rnb_history_pres.csv will then be automatically pushed to the repository. Use the flow found in scrape.yml to update the repository.
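The row format above can be sketched as a small formatting helper. This is a minimal illustration, not the project's actual script; the field names come from the table above, and the id value shown in the usage is illustrative.

```python
# Hypothetical sketch: format one QSO as a row of rm_rnb_history_pres.csv.
# Field order follows the standard format above.
import csv
import io

FIELDS = ["id", "tx_lng", "tx_lat", "rx_lng", "rx_lat",
          "timestamp", "dB", "frequency", "Spotter"]

def format_qso_row(qso):
    """Return a single csv line (no newline) for a QSO dict keyed by FIELDS."""
    buf = io.StringIO()
    csv.writer(buf, lineterminator="").writerow([qso[f] for f in FIELDS])
    return buf.getvalue()
```

Appending the returned line (plus a newline) to rm_rnb_history_pres.csv keeps the file in the standard format.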
This project will be divided into the following phases which will each get their own issue:
Add a feature that determines if input is invalid. For now there are two kinds of invalid input that should be detected:
Add the feature to qso_spot_kml.py. Add test cases for each of the above. Use this feature to create a tests directory in the project.
It will be interesting to see if the station locations change with height of the F2 layer.
We need an easy way for people to add other QSOs in addition to spots. This will be done with the following steps:
add_qso.py qso_locs.txt >> rm_rnb_history_pres.csv
rm qso_locs.txt
The key field will be assigned a random 32 bit number.
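The random 32-bit key assignment could look like the following sketch (function name is illustrative):

```python
# Sketch of the key assignment described above: a random 32-bit number.
import random

def new_qso_key():
    """Return a random 32-bit integer to use as a record key."""
    return random.getrandbits(32)
```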
Widen the search criteria back up to 'kd0*', then add filters that search through RBN spots matching kd0* based on the known frequencies (about 8 of them) the Rockmite has been spotted on.
At the moment, the output of map_qso.py -hh is piped into rm_rnb_history_pres.csv. If the script encounters an input error after one or more QSOs have been successfully processed, then when the user corrects the error the earlier, successful QSOs are duplicated.
Proposed fix: pipe the output of map_qso.py -hh into an intermediate file. In the next step (which won't be reached unless map_qso.py -hh completes successfully), append the intermediate file, via cat, onto the end of rm_rnb_history_pres.csv. In this manner the QSOs will only be added once.
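The same append-only-on-success idea can be sketched in Python (file names and the success flag are illustrative; the actual fix uses shell piping and cat):

```python
# Sketch of the proposed fix: keep new QSOs in an intermediate file and
# only append them to the history file if processing fully succeeded.
import os

def append_if_complete(intermediate_path, history_path, success):
    """Append intermediate output to the history file only on success."""
    if not success:
        return False
    with open(intermediate_path) as src, open(history_path, "a") as dst:
        dst.write(src.read())
    os.remove(intermediate_path)  # so a rerun can't append the same QSOs twice
    return True
```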
They were pulled out by the latest change of the scraper file https://github.com/hcarter333/rm-rbn-history/blob/6fd42ace7837218b013e9fa7b820f2dbb51da169/.github/workflows/scrape.yml
Make the KD0FNR station location a variable so it can be easily reassigned when the station is remote.
Remove the git diff statement from scrape.yml. It shouldn't be needed anymore now that we have key differentiation thanks to nkc.py.
Document the completed feature for adding QSOs to the presentation csv file.
Add an additional two columns to the kepler.gl station spot csv data file. The first is to track the strength of the received signal from the Rockmite. The second is to label the stations that spotted the Rockmite.
Unify the process for handling qso and spotter data.
Perhaps do this by making both kml and geojson output into tools that can be chosen instead of being arbitrarily executed.
The existing flow could be left in place and data simply stored into a separate file as described in #12 .
ROYGBIV gives us 7 of the 9 values we need (there is no 0). Use brown for 1 and white for 9.
This feature is already prototyped using the pseudocode at
https://gis.stackexchange.com/questions/18584/how-to-find-a-point-half-way-between-two-other-points
The prototype code was just released.
Patching it into a debug file for #13 gives the following pretty cool result:
Because sometimes you only make one QSO.
Add the expe_kml.py arguments -b and -e as optional arguments to map_qso.py. This will allow the user to specify what time range around a QSO or group of QSOs to include RBN spots from. Also include a -rhh (range: +/- half hr) option to make the range half an hour on either side of the QSOs (one hour total).
Given the contents of qso_update.csv, the script should output lines formatted as:
<tr><td>KBTEST</td><td>539</td><td>559</td><td>16:42</td><td>14058.3 kHz</td></tr>
with one line per QSO.
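The row format above can be sketched with a small helper. The parameter names are guesses at the relevant qso_update.csv columns (call, rx RST, tx RST, time, frequency), not the script's actual names:

```python
# Sketch of the table-row output described above.
def qso_table_row(call, rx_rst, tx_rst, utc_time, freq_khz):
    """Format one QSO as an HTML table row like the example."""
    return (f"<tr><td>{call}</td><td>{rx_rst}</td><td>{tx_rst}</td>"
            f"<td>{utc_time}</td><td>{freq_khz} kHz</td></tr>")
```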
Change the script to expect the following three lines at the top of the QSO file:
title
longitude
latitude
This will probably also change so that the file pulled from is the good old qso_locs.txt
The '/' character was causing the file open to fail because it was not changed to an underscore first. This has been fixed.
This will enable tracking the progression of the maximum usable frequency vs the QSOs/spots from the Rockmite.
Add callsign, date and time, reported signal strength, and local (and remote?) ionosonde data to kml lines from station to station. Perhaps add bracketing ionosonde times to the time of the QSO. Search the ionosonde directory for the day, such as
https://lgdc.uml.edu/common/DIDBDayStationStatistic?ursiCode=PA836&year=2023&month=3&day=5
The finished feature should look similar to the example images attached to the issue.
Add a field that stores the tx RST; this will make each record complete QSO-wise.
Use code that looks like this:
map_addr = 'http://maps.google.com/maps/api/staticmap?size=85x85&maptype=hybrid' +
           '&markers=color:blue|label:T|' + tx_latlng.lat() + ',' + tx_latlng.lng() +
           '&markers=color:blue|label:R|' + rx_latlng.lat() + ',' + rx_latlng.lng() +
           '&path=color:0xff0000ff|weight:5|' + tx_latlng.lat() + ',' + tx_latlng.lng() +
           '|' + rx_latlng.lat() + ',' + rx_latlng.lng() + '&sensor=false';
To test this one on github itself, I'll have to also learn how to encrypt passwords here.
Write a method that returns a QRZ session id for use in the script described in #32 .
def get_qrz_session():
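A hedged sketch of how get_qrz_session might work. The QRZ XML endpoint URL and the `<Key>` element in the response are assumptions based on QRZ's XML interface and should be verified against their docs before use:

```python
# Hedged sketch of get_qrz_session for #32. Endpoint and response shape
# are assumptions; verify against the QRZ XML interface documentation.
import re
import urllib.parse
import urllib.request

def parse_qrz_session_key(xml_text):
    """Pull the session key out of a QRZ XML response, or return None."""
    m = re.search(r"<Key>([0-9a-fA-F]+)</Key>", xml_text)
    return m.group(1) if m else None

def get_qrz_session(username, password):
    """Log in to QRZ and return a session id string (or None on failure)."""
    params = urllib.parse.urlencode({"username": username,
                                     "password": password})
    url = "https://xmldata.qrz.com/xml/current/?" + params
    with urllib.request.urlopen(url) as resp:
        return parse_qrz_session_key(resp.read().decode())
```

Splitting the parsing out of the network call makes the key extraction testable without credentials.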
Document the step by step process for adding calls to Google Earth as kml files.
If nothing else so I don't forget how to add log entries.
Convert two linestrings to a polygon in the simplest way possible. Make their timestamps different. Determine if this does in fact enable animation in Google Earth Studio.
See the kepler animation docs here
https://docs.kepler.gl/docs/user-guides/h-playback
Create a python method that, given a QRZ.com data session id and a valid ham radio callsign, returns the mailing address (organized with + signs as separators as detailed here) associated with that callsign. (Don't sweat the stuff about encoding the plus sign. The call to requests.get does that for you.)
The geolocation request needs an address formatted like this example:
https://maps.googleapis.com/maps/api/geocode/json?address=1600+Amphitheatre+Parkway,+Mountain+View,+CA&key=YOUR_API_KEY
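Building that request URL can be sketched with the standard library; quote_plus produces the '+' separators shown in the example, and YOUR_API_KEY is a placeholder:

```python
# Sketch of building the geocoding request from an address string.
import urllib.parse

def geocode_url(address, api_key):
    """Return a Google geocoding API URL for the given address string."""
    return ("https://maps.googleapis.com/maps/api/geocode/json?address="
            + urllib.parse.quote_plus(address) + "&key=" + api_key)
```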
The date parsing code that reads the file (not the code that used to read the command line) isn't issuing error messages; it's just stopping the entire script. It should issue a message and then stop the entire script. No data left behind.
Code located in expe_kml_defs.py
The most recent spot:
"1522481322": [ "K5TR", "14057.7", "KD0FNR", 3, 19, "0005z 28 Jan" ]
should result in a row with K5TR and a time of 16:05. However, the latest entry in the current version of rm_rnb_history_pres.csv is at 15:47 today, 1/27/23.
I'm going to delete all entries from the history file after 15:47 to see if the issue will fix itself.
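The UTC-to-local conversion implied above (0005z on 28 Jan maps to 16:05 the previous local day) can be sketched as follows. The fixed UTC-8 Pacific offset is an assumption for January and is not DST-aware:

```python
# Sketch: convert an RBN 'HHMMz DD Mon' UTC timestamp to local 'HH:MM'.
# The fixed -8 hour offset (Pacific, winter) is an assumption.
from datetime import datetime, timedelta, timezone

def spot_time_to_local(spot_time, year=2023, utc_offset_hours=-8):
    """Convert an RBN 'HHMMz DD Mon' string to local 'HH:MM'."""
    utc = datetime.strptime(f"{spot_time} {year}", "%H%Mz %d %b %Y")
    utc = utc.replace(tzinfo=timezone.utc)
    local = utc.astimezone(timezone(timedelta(hours=utc_offset_hours)))
    return local.strftime("%H:%M")
```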
Retrieve the session handle only once to cut down on network traffic and increase run speed.
When the QSO input line in qso_locs.txt looks like:
-122.602921384713,37.9039783300392,-76.1392728835462,43.047770700162,,2023/03/22 00:51:00,339,141058.3,k2upd
(notice the extra comma before the date field, which gives the line an incorrect number of fields; the correct number at the moment is 8), no entry in rm_rnb_history_pres.csv is created. While this is desirable, the lack of error messages means some QSOs can go unrecorded if the output of the process is not carefully checked.
The script, qso_spot_kml.py should process all lines that it can, and then give a bad return code causing the github action to fail. That's what this line was intended to do. The line is not having that effect:
There was no error message in the above output.
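The intended behavior can be sketched as follows: validate every line, warn on stderr for bad ones, and return a nonzero exit code so the GitHub action fails. The field count of 8 comes from the issue text; function names are illustrative:

```python
# Sketch: process all lines, report bad ones, exit nonzero if any failed.
import sys

EXPECTED_FIELDS = 8

def check_line(line, line_no):
    """Return True if the line has the right field count, else warn on stderr."""
    n = len(line.rstrip("\n").split(","))
    if n != EXPECTED_FIELDS:
        print(f"line {line_no}: expected {EXPECTED_FIELDS} fields, got {n}",
              file=sys.stderr)
        return False
    return True

def process(lines):
    """Check every line; return the exit code for sys.exit()."""
    ok = [check_line(l, i + 1) for i, l in enumerate(lines)]
    return 0 if all(ok) else 1
```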
When passed an address string from get_qrz_call_geo_address this method will return
"lat,lng"
where lat is the latitude associated with the callsign and lng is the associated longitude.
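Extracting that pair can be sketched against the standard Google geocoding JSON response shape (results[0].geometry.location); this assumes a successfully decoded response dict:

```python
# Sketch: pull 'lat,lng' out of a decoded geocoding response dict.
def latlng_from_geocode(response):
    """Return 'lat,lng' from a geocoding response."""
    loc = response["results"][0]["geometry"]["location"]
    return f"{loc['lat']},{loc['lng']}"
```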
Add 'labels' (there's a name for this) that hold information such as callsign, date and time, and signal strength. Also add a general label that describes the kml file as a whole.
They are being updated, but it's unclear if they're used for anything. If they are not, then rm them in the workflow
Summit-to-summit and park-to-park QSOs will be inaccurate if they contain the station's physical location rather than the park or summit.
Add two optional, trailing fields to the definition of qso_update.csv. If it finds these fields, the app map_qso.py will skip geocoding using qrz.com and the Google maps API, and simply use the two fields as is.
Stretch goal, check if the first field is not a floating point number, and then look up the lat/lng from either sota or pota based on the park or summit code found there.
JG0AWE caused an issue because it had no state. That led to the address construction code crashing on a None reference. The issue was fixed by checking for the existence of each field and including the country code.
Add a feature so that when NOTE is encountered in the call field, the mapping app creates a note placemarker at the transmit location. This placemarker description will be supplied by the following line in qso_update.csv. The placemarker will have a timestamp after NOTE just like QSOs do. After the note description line, execution will resume as normal.
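A sketch of the NOTE handling described above: when the call field is NOTE, consume the next line of qso_update.csv as the placemarker description. Field positions and return shapes are illustrative:

```python
# Sketch: split qso_update.csv lines into QSO records and NOTE records,
# where a NOTE's description is supplied by the following line.
def read_records(lines):
    """Yield ('qso', line) or ('note', note_line, description)."""
    it = iter(lines)
    for line in it:
        fields = line.split(",")
        if fields[0] == "NOTE":
            yield ("note", line, next(it, ""))
        else:
            yield ("qso", line)
```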
Create a second workflow that only triggers on push with a .py filter, i.e.:
on:
  push:
    paths:
      - '**.py'
based on the docs
Then, remove the push: trigger from scrape.yml.
This will cut down on mapping API calls (which are billable) during QSO and RBN spot updates.
Because dates don't come with a year, for now they're being incorrectly set to 2023.
Next: Get first background set of data, fix the year, and then cut the search window moving forward. Think about whether or not this makes sense. (The fixes will need to be made to the json file that doesn't have years...)
Also consider storing the map file here. Then, the first set of fixes do make sense.
Also, figure out way to only place unique spots in the json files or at least the map file.
mail_qsl.py fails because a new 'description' line was added to qso_update.csv. Eventually, the initial line handling code should be moved to a method in auto_geo_vars.py, since the first four lines of the file load the variables in that module.
Using the data in qso_update.csv, output lines formatted as:
call,date_time,rx_rst,tx_rst,Call\nName\naddr1\naddr2\n
Add a feature so maps can be overlaid on Google Earth by loading in the KML file. Use the formatting examples at
http://dagik.org/kml_intro/E/line.html
as a guide
Add a feature to output POTA adif files based on the input lines in qso_update.csv.
Output lines should look like:
<station_callsign:6>KD0FNR<Call:4>N6PF<QSO_DATE:8>20230404<TIME_ON:4>1444<BAND:3>20M<MODE:2>CW<MY_SIG:4>POTA<MY_SIG_INFO:6>K-0647<eor>
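Each ADIF field is `<NAME:length>value`, with the record terminated by `<eor>`; that can be sketched with a small formatter (function names are illustrative):

```python
# Sketch of the ADIF output: each field carries its value's length tag.
def adif_field(name, value):
    """Format one ADIF field with its length tag."""
    return f"<{name}:{len(value)}>{value}"

def pota_adif_line(fields):
    """Join (name, value) pairs into one ADIF record ending in <eor>."""
    return "".join(adif_field(n, v) for n, v in fields) + "<eor>"
```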
Use git log to create a list of versions like this:
git log --pretty="format:%H" rm_rnb_history.csv
Step through every version after the base version:
35f5461
for each revision:
use git diff redirected into a file, then grep the lines that begin with a single +. Like this:
git diff 35f5461 cf9ece1 rm_rnb_history.csv > test.out
delete the plus (using a group regex like in nedit?), then append the output to a new map file.
cp the new map file to the old one. Store the revision.
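The grep-and-strip step above can be sketched in Python: keep lines added by the diff (a single leading '+', excluding the '+++' file header) and drop the plus:

```python
# Sketch: extract added data lines from `git diff` output.
def added_lines(diff_text):
    """Return the lines added in a diff, with the leading '+' removed."""
    return [line[1:] for line in diff_text.splitlines()
            if line.startswith("+") and not line.startswith("+++")]
```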
Add a script to create kml maps of POTA expeditions. The inputs should be a date range, a location, and the output map name.
The output kml file should be placed in the maps directory of the repository. (Look into auto-transferring to a Google Drive folder at some point.)
The following steps should do the trick:
This feature should feed (via a pipe) qso_kml.py which in turn uses qso_spot_kml.py .