
rm-rbn-history's People

Contributors

arhonah, hcarter333


rm-rbn-history's Issues

Implement flow to update QSOs from github

Modify the actions for this project to utilize auto_geo_update to automatically create kml maps. The advantage of this will be that the mapping data will be revision controlled here. Also, revision control the output maps in a folder named maps.

Add automatic geolocation updates for new QSOs

Create a python script that adds new QSO information (including, especially, the station's geographic information) to rm_rnb_history_pres.csv. The script will require the following input:

Command line:

  • callsign
  • GMT (date and time formatted as YYYY/MM/DD HH:MM:SS)
  • QTH of rx station (the op's QTH, KD0FNR in this example)
  • rx RST
  • tx RST
  • QRZ password

The program will then write QSOs to rm_rnb_history_pres.csv in the standard format:

id tx_lng tx_lat rx_lng rx_lat timestamp dB frequency Spotter

Where the dB field will be the rx RST.

rm_rnb_history_pres.csv will then be automatically pushed to the repository. Use the flow found in scrape.yml to update the repository.
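The command-line interface and output row described above could be sketched like this; the argument names and the `make_row` helper are illustrative assumptions, not the project's actual code.

```python
import argparse

def parse_args(argv):
    """Parse the six command-line inputs listed in this issue."""
    p = argparse.ArgumentParser(description="Append a QSO to rm_rnb_history_pres.csv")
    p.add_argument("callsign")
    p.add_argument("gmt", help="date and time formatted as YYYY/MM/DD HH:MM:SS")
    p.add_argument("qth", help="QTH of the rx station (the op's QTH)")
    p.add_argument("rx_rst")
    p.add_argument("tx_rst")
    p.add_argument("qrz_password")
    return p.parse_args(argv)

def make_row(qso_id, tx_lng, tx_lat, rx_lng, rx_lat, timestamp, rx_rst, freq, spotter):
    """Build one row in the standard format:
    id tx_lng tx_lat rx_lng rx_lat timestamp dB frequency Spotter,
    where the dB field carries the rx RST."""
    return [qso_id, tx_lng, tx_lat, rx_lng, rx_lat, timestamp, rx_rst, freq, spotter]
```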

This project will be divided into the following phases which will each get their own issue:

  • get_call_geo_loc.py Returns a row for ... complete these later

Add input error handling to kml creation script

Add a feature that determines if input is invalid. For now there are two kinds of invalid input that should be detected:

  1. Incorrect number of fields in an input line.
  2. A space following a comma.

Add the feature to qso_spot_kml.py. Add test cases for each of the above. Use this feature to create a tests directory in the project.
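A minimal sketch of the two checks described above; the function name and the expected field count are assumptions (the count matches the nine-column standard format listed earlier).

```python
EXPECTED_FIELDS = 9  # id, tx_lng, tx_lat, rx_lng, rx_lat, timestamp, dB, frequency, Spotter

def validate_line(line, expected_fields=EXPECTED_FIELDS):
    """Return a list of error strings for one csv input line;
    an empty list means the line passed both checks."""
    errors = []
    fields = line.rstrip("\n").split(",")
    if len(fields) != expected_fields:
        errors.append(f"expected {expected_fields} fields, got {len(fields)}")
    if ", " in line:
        errors.append("space following a comma")
    return errors
```

Each check maps directly onto one of the test cases the issue asks for in the tests directory.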

Feature to add QSOs to spot data

We need an easy way for people to add other QSOs in addition to spots. This will be done with the following steps:

  1. Move location to its own .py file for multi-script access
  2. Add qso_locs.txt file to the repository. It will have the format: sta_lat,sta_lng,rx_lat,rx_lng,yyyy/mm/dd hh:mm:ss,s(of rst),freq(kHz),rx_call
  3. Add new python script add_qso.py that interprets the csv format in 2 and dumps out properly formatted lines that can be appended to rm_rnb_history_pres.csv.
  4. On each update action, run qso_locs.txt through add_qso.py, then remove qso_locs.txt. The update at the end of the action script should bring back the original empty file. Like:
add_qso.py qso_locs.txt >> rm_rnb_history_pres.csv
rm qso_locs.txt

The key field will be assigned a random 32 bit number.
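The conversion in steps 2 and 3 could be sketched as below. The output column order follows the standard format quoted earlier (note the qso_locs.txt input is lat-first while the output is lng-first); the function name is an assumption.

```python
import random

def qso_loc_to_row(line):
    """Convert one qso_locs.txt line
    (sta_lat,sta_lng,rx_lat,rx_lng,yyyy/mm/dd hh:mm:ss,s(of rst),freq(kHz),rx_call)
    into a rm_rnb_history_pres.csv line, prepending a random 32 bit key."""
    sta_lat, sta_lng, rx_lat, rx_lng, timestamp, s_rst, freq, rx_call = line.strip().split(",")
    key = random.getrandbits(32)  # the key field is a random 32 bit number
    # output order: id, tx_lng, tx_lat, rx_lng, rx_lat, timestamp, dB, frequency, Spotter
    return ",".join([str(key), sta_lng, sta_lat, rx_lng, rx_lat, timestamp, s_rst, freq, rx_call])
```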

kd0r now a known alias

Widen the search criteria back up to 'kd0*', then add filters that search through RBN spots matching kd0* based on the known frequencies (about 8 of them) the Rockmite has been spotted on.

QSOs duplicated due to input errors

At the moment, the output of map_qso.py -hh is piped into rm_rnb_history_pres.csv. If the script encounters an input error after one or more QSOs have been successfully processed, then when the user corrects the error and reruns, the earlier, successful QSOs are duplicated.

Proposed fix: pipe the output of map_qso.py -hh into an intermediate file. In the next step (which won't be reached unless map_qso.py -hh completes successfully), append the intermediate file (via cat) onto the end of rm_rnb_history_pres.csv. In this manner the QSOs will only be added once.
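The append-only-on-success step could be sketched in Python as follows; the function name and the success flag are assumptions standing in for the exit status of the producing step.

```python
import os

def safe_append(intermediate_path, history_path, ok):
    """Append the intermediate file onto the history file only when the
    producing step reported success, so reruns never duplicate QSOs."""
    if not ok:
        return False
    with open(intermediate_path) as src, open(history_path, "a") as dst:
        dst.write(src.read())
    os.remove(intermediate_path)  # consumed; a rerun starts clean
    return True
```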

Add signal dB, receiving call sign and frequency

Add an additional two columns to the kepler.gl station spot csv data file. The first is to track the strength of the received signal from the Rockmite. The second is to label the stations that spotted the Rockmite.

Unify qso and spotter data

Unify the process for handling qso and spotter data.

Perhaps do this by making both kml and geojson output into tools that can be chosen instead of being arbitrarily executed.

The existing flow could be left in place and the data simply stored in a separate file as described in #12.

Beginning and End time as optional args to map_qso.py

Because sometimes you only make one QSO.

Add the expe_kml.py arguments -b and -e as optional arguments to map_qso.py. This will allow the user to specify what time range around a QSO or group of QSOs to include RBN spots from. Also include a -rhh (range: +/- half hr) option to make the range one hour on either side of the QSOs.
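Reading the -rhh option as a window of half an hour on either side of the QSO (one hour total), the range computation could look like this; the function name is an assumption, and the timestamp format matches the YYYY/MM/DD HH:MM:SS convention used elsewhere in the project.

```python
from datetime import datetime, timedelta

def half_hour_range(qso_time_str):
    """Return (begin, end) datetimes spanning +/- half an hour
    around a QSO timestamp formatted as YYYY/MM/DD HH:MM:SS."""
    t = datetime.strptime(qso_time_str, "%Y/%m/%d %H:%M:%S")
    return t - timedelta(minutes=30), t + timedelta(minutes=30)
```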

Develop a faster QSO mapping routine

Use code that looks like this (JavaScript, building a Google static map URL):

map_addr = 'http://maps.google.com/maps/api/staticmap?size=85x85&maptype=hybrid' +
    '&markers=color:blue|label:T|' + tx_latlng.lat() + ',' + tx_latlng.lng() +
    '&markers=color:blue|label:R|' + rx_latlng.lat() + ',' + rx_latlng.lng() +
    '&path=color:0xff0000ff|weight:5|' + tx_latlng.lat() + ',' + tx_latlng.lng() + '|' +
    rx_latlng.lat() + ',' + rx_latlng.lng() + '&sensor=false';

Create method to return mailing address corresponding to callsign

Create a python method that, given a QRZ.com data session id and a valid ham radio callsign, returns the mailing address (organized with + signs as separators as detailed here) associated with that callsign. (Don't sweat the stuff about encoding the plus sign. The call to requests.get does that for you.)

The geolocation request needs an address formatted like this example:
https://maps.googleapis.com/maps/api/geocode/json?address=1600+Amphitheatre+Parkway,+Mountain+View,+CA&key=YOUR_API_KEY
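Building that '+'-separated address could be sketched like this; the function name and the street/city/state field order are assumptions, and the returned URL is what would then be handed to requests.get.

```python
def geocode_url(address_parts, api_key):
    """Join mailing-address parts with '+' separators in the format the
    geocoding endpoint expects, e.g.
    1600+Amphitheatre+Parkway,+Mountain+View,+CA"""
    address = ",+".join("+".join(part.split()) for part in address_parts)
    return ("https://maps.googleapis.com/maps/api/geocode/json"
            f"?address={address}&key={api_key}")
```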

Refine date_time failures in expe_kml_defs.py

The date parsing code that reads the file (not the code that used to read the command line) isn't issuing error messages; it's just stopping the entire script. It should issue a message and then stop the entire script. No data left behind.

Code located in expe_kml_defs.py

Missing spots in rm_rnb_history_pres.csv file

The most recent spot:

 "1522481322": [
    "K5TR",
    "14057.7",
    "KD0FNR",
    3,
    19,
    "0005z 28 Jan"
  ]

should result in a row with K5TR and a time of 16:05. However, the latest entry in the current version of rm_rnb_history_pres.csv is at a time of 15:47 today, 1/27/23.
I'm going to delete all entries from the history file after 15:47 to see if the issue will fix itself.

When QSO input has incorrect commas, no error is fired

When the QSO input line in qso_locs.txt looks like:
-122.602921384713,37.9039783300392,-76.1392728835462,43.047770700162,,2023/03/22 00:51:00,339,141058.3,k2upd
(notice the extra comma before the date field that makes the line have an incorrect number of fields; the correct number of fields at the moment is 8), no entry in rm_rnb_history_pres.csv is created. While this is desirable, the lack of error messages means some QSOs can go unrecorded if the output of the process is not carefully checked.

The script, qso_spot_kml.py, should process all lines that it can, and then give a bad return code causing the github action to fail. That's what this line was intended to do. The line is not having that effect: the action output (screenshot not preserved here) contained no error message.
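The intended process-everything-then-fail behavior could be sketched like this; the function names are illustrative, not the script's actual ones, and the caller would pass the count to sys.exit so the github action sees a nonzero status.

```python
import sys

def process_all(lines, handle_line):
    """Process every line it can; report each failure on stderr and
    return the failure count (caller exits nonzero when it is > 0)."""
    bad = 0
    for n, line in enumerate(lines, 1):
        try:
            handle_line(line)
        except ValueError as err:
            print(f"line {n}: {err}", file=sys.stderr)
            bad += 1
    return bad
```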

Add labels to kml files

Add 'labels' (in kml these are the placemark name and description elements) that hold information such as callsign, date and time, and signal strength. Also add a general label that describes the kml file as a whole.

define qso_update.csv tailing fields for lat/lng of s2s, p2p

Summit to summit and park to park QSOs will be mapped inaccurately if they contain the station's physical location rather than the park or summit.

Add two optional, trailing fields to the definition of qso_update.csv. If it finds these fields, the app map_qso.py will skip geocoding using qrz.com and the Google maps API, and simply use the two fields as is.

Stretch goal: check if the first field is not a floating point number, and then look up the lat/lng from either SOTA or POTA based on the park or summit code found there.
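The stretch-goal dispatch could be sketched as below; the function name is an assumption, and lookup_code is a hypothetical callback standing in for the SOTA/POTA lookup.

```python
def resolve_location(first_field, second_field, lookup_code):
    """If the trailing fields parse as floats, use them as lat/lng
    directly; otherwise treat the first field as a park or summit code
    and resolve it through the supplied lookup."""
    try:
        return float(first_field), float(second_field)
    except ValueError:
        return lookup_code(first_field)
```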

QRZ.com lookup has no state for some countries

JG0AWE caused an issue because it had no state. That led to the address construction code crashing on a None reference. The issue was fixed by checking for the existence of each field and including the country code.
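The None-safe construction described above amounts to skipping missing fields; a minimal sketch, with field names modeled on (but not confirmed to match) the QRZ.com lookup result:

```python
def build_address(street, city, state, country):
    """Join whichever address fields exist, skipping None/empty ones,
    so a callsign with no state no longer crashes the lookup."""
    parts = [p for p in (street, city, state, country) if p]
    return ", ".join(parts)
```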

Add chronological notes to log

Add a feature so that when NOTE is encountered in the call field, the mapping app creates a note placemarker at the transmit location. This placemarker description will be supplied by the following line in qso_update.csv. The placemarker will have a timestamp after NOTE just like QSOs do. After the note description line, execution will resume as normal.
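The NOTE convention above could be parsed roughly as follows; the assumption that the call sits in the last field (as in qso_locs.txt) and the helper name are both illustrative.

```python
def extract_notes(lines):
    """Scan qso_update.csv-style lines; when the call field reads NOTE,
    pair its timestamp with the description on the following line and
    resume normal processing after it."""
    notes, i = [], 0
    while i < len(lines):
        fields = lines[i].split(",")
        if fields[-1].strip() == "NOTE" and i + 1 < len(lines):
            timestamp = next((f for f in fields if "/" in f), "")
            notes.append((timestamp, lines[i + 1].strip()))
            i += 2  # skip the description line
        else:
            i += 1
    return notes
```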

Create second workflow to reduce map api calls

Create a second workflow that only triggers on push with a .py filter, i.e.

on:
  push:
    paths:
      - '**.py'

based on the docs

Then, remove the push: trigger from scrape.yml

This will cut down on mapping API calls (which are billable) during QSO and RBN spot updates.

Fix dates from 2022

Because dates don't come with a year, for now they're being incorrectly set to 2023.
Next: get the first background set of data, fix the year, and then cut the search window moving forward. Think about whether or not this makes sense. (The fixes will need to be made to the json file that doesn't have years...)
Also consider storing the map file here. Then the first set of fixes do make sense.
Also, figure out a way to only place unique spots in the json files, or at least the map file.
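One hedged approach to the missing-year problem (an assumption, not the project's chosen fix): a spot can't be in the future, so if its month/day lands after the scrape date, it belongs to the previous year.

```python
from datetime import date  # scrape_date can be any object with year/month/day

def infer_year(month, day, scrape_date):
    """Pick the year for a yearless spot date: same year as the scrape
    unless that would place the spot in the future."""
    year = scrape_date.year
    if (month, day) > (scrape_date.month, scrape_date.day):
        year -= 1
    return year
```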

mail_qsl.py fails due to new description line

mail_qsl.py fails because a new 'description' line was added to qso_update.csv. Eventually, the initial line handling code should be moved to a method contained in auto_geo_vars.py, since the first four lines of the file load the variables in that module.

Script to output POTA adif files

Add a feature to output POTA adif files based on the input lines in qso_update.csv.

Output lines should look like:
<station_callsign:6>KD0FNR<Call:4>N6PF<QSO_DATE:8>20230404<TIME_ON:4>1444<BAND:3>20M<MODE:2>CW<MY_SIG:4>POTA<MY_SIG_INFO:6>K-0647<eor>
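The ADIF encoding shown in the example is each field as <NAME:length>value, terminated by <eor>; a small sketch (helper names are assumptions):

```python
def adif_field(name, value):
    """Encode one ADIF field as <name:length>value."""
    value = str(value)
    return f"<{name}:{len(value)}>{value}"

def adif_record(fields):
    """Encode an ordered list of (name, value) pairs as one ADIF record."""
    return "".join(adif_field(name, value) for name, value in fields) + "<eor>"
```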

Accumulate new map using added lines in git diff

Use git log to create a list of versions like this:
git log --pretty="format:%H" rm_rnb_history.csv
Step through every version after the base version:
35f5461

For each revision:
use git diff redirected into a file to grep the lines that begin with a single +. Like this:
git diff 35f5461 cf9ece1 rm_rnb_history.csv > test.out
Delete the plus (using a group regex like in nedit?), then append the output to a new map file.

cp the new map file to the old one. Store the revision.
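The per-revision extraction step above (grep lines starting with a single +, then strip the plus) could be done in Python instead of a regex editor; the helper name is an assumption, and the driver loop over git log output is left to the action script.

```python
def added_lines(diff_text):
    """Return the lines a git diff added: a single leading '+',
    excluding the '+++' file header, with the plus stripped."""
    out = []
    for line in diff_text.splitlines():
        if line.startswith("+") and not line.startswith("+++"):
            out.append(line[1:])
    return out
```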

Feature to create kml maps based on date range and station location

Add a script to create kml maps of POTA expeditions. The inputs should be a date range, a location, and the output map name.

The output kml file should be placed in the maps directory of the repository. (Look into auto-transferring to a google drive folder at some point.)

The following steps should do the trick:

  1. Pull in lines from rm_rnb_history_pres.csv.
  2. Use the date/time field to compare to the quote-enclosed date inputs on the command line. (inputs section DONE)
  3. Filter for lines that match the range
  4. Ignore the key field.
  5. Substitute the home station location with the location on the command line

This feature should feed (via a pipe) qso_kml.py which in turn uses qso_spot_kml.py .
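Steps 1 through 5 above could be sketched as one filter pass; the function name is an assumption, and the column positions follow the id tx_lng tx_lat rx_lng rx_lat timestamp ... format given earlier.

```python
from datetime import datetime

def filter_rows(rows, begin, end, tx_lng, tx_lat):
    """Keep rm_rnb_history_pres.csv rows inside [begin, end], drop the
    key field, and substitute the command-line expedition location for
    the home station location."""
    fmt = "%Y/%m/%d %H:%M:%S"
    b, e = datetime.strptime(begin, fmt), datetime.strptime(end, fmt)
    kept = []
    for row in rows:
        fields = row.split(",")
        if b <= datetime.strptime(fields[5], fmt) <= e:
            fields[1], fields[2] = tx_lng, tx_lat  # expedition location
            kept.append(",".join(fields[1:]))      # ignore the key field
    return kept
```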
