codeforboston / voiceapp311
Voice assistant connection to Boston services
License: MIT License
Analyze Boston has a CSV file with what I believe is every address in the city (https://data.boston.gov/dataset/live-street-address-management-sam-addresses). This would be one way to test the performance of our code and smoke out any bugs. Any ideas on how to automate the checking of results? Also, there are 300,000+ addresses!
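One low-tech way to start, sketched below: sample a manageable, reproducible subset of the 300,000+ rows and feed each address through the skill's handler. The column name and the intent-handler call are hypothetical placeholders.

```python
import csv
import random

def sample_addresses(rows, n, seed=0):
    """Pick a reproducible random sample so 300,000+ rows stay manageable."""
    rng = random.Random(seed)
    return rng.sample(rows, min(n, len(rows)))

# Hypothetical usage -- the column name and intent handler are placeholders:
# with open("sam_addresses.csv") as f:
#     rows = [r["FULL_ADDRESS"] for r in csv.DictReader(f)]
# for addr in sample_addresses(rows, 100):
#     response = run_trash_day_intent(addr)  # would need to exist
#     # assert on response here, or at least log failures for review
```

Fixing the seed means a failing address can be reproduced on the next run.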
There is a way to have notifications built within Alexa and this would be a useful feature to have in our Alexa skill.
With one request the user can get all information about their street (street cleaning, trash, etc.). This combines multiple of our individual intents.
User should be able to ask if there is currently a snow emergency in the city.
There is no API for this information, but it is posted at http://www.boston.gov/snow as a banner at the top of the page. We should be able to parse this page. For development, let's see how this works from the page source as well as from https://github.com/CityOfBoston/boston.gov
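As a rough sketch of the parsing step using only the stdlib (the "alert" class name is a guess; the real markup on boston.gov needs to be inspected first):

```python
from html.parser import HTMLParser

class BannerScraper(HTMLParser):
    """Collect the text inside any element whose class mentions 'alert'.
    The class name is a guess; inspect boston.gov's real markup first."""

    def __init__(self):
        super().__init__()
        self._depth = 0       # > 0 while we are inside a banner element
        self.banner_text = []

    def handle_starttag(self, tag, attrs):
        if self._depth or "alert" in (dict(attrs).get("class") or ""):
            self._depth += 1

    def handle_endtag(self, tag):
        if self._depth:
            self._depth -= 1

    def handle_data(self, data):
        if self._depth and data.strip():
            self.banner_text.append(data.strip())
```

Feeding the page HTML to `BannerScraper().feed(html)` and joining `banner_text` would give the banner message, if any.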
Dataset:
https://data.boston.gov/dataset/open-space/resource/769c0a21-9e35-48de-a7b0-2b7dfdefd35e
Find the nearest park/playground.
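A sketch of the distance step, assuming we pull park coordinates out of the dataset (e.g. via the CKAN datastore API at data.boston.gov) and pick the closest by great-circle distance:

```python
import math

def haversine_miles(lat1, lon1, lat2, lon2):
    """Great-circle distance in miles between two lat/lon points."""
    r = 3958.8  # mean earth radius in miles
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlam = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dlam / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest(origin, places):
    """origin: (lat, lon); places: iterable of (name, lat, lon) tuples.
    Coordinates would come from the open-space resource, e.g. via
    https://data.boston.gov/api/3/action/datastore_search (the field
    names in that resource would need checking)."""
    return min(places, key=lambda p: haversine_miles(origin[0], origin[1], p[1], p[2]))
```

If we end up routing everything through the Google Maps utils anyway, this local computation could serve as a cheap pre-filter to avoid sending 300+ candidates to the API.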
We have some old user-facing text (e.g. "Thank you for using the TrashApp skill.") and some responses that seem overly verbose, which makes using the skill slow. Before the initial release, we should do a review pass on all of this text.
User should be able to ask service about any city emergencies or alerts.
The city of Boston posts city-wide emergency alerts on https://www.boston.gov. These appear as a banner at the top of the page. We should be able to figure out how to parse this info from the page source and by looking through https://github.com/CityOfBoston/boston.gov
Add an intent that returns the street sweeping days for the street currently stored in the session.
The relevant API is here:
https://data.boston.gov/dataset/street-sweeping-schedules/resource/9fdbdcad-67c8-4b23-b6ec-861e77d56227
We should discuss what to output for this. Options include:
We also might want to consider whether or not we have an interface into the basic Alexa functionality so that we could do something like, "Remind me the day before the next street sweeping."
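As a sketch, the CKAN datastore search API at data.boston.gov could be queried for a street like this. The `q` full-text parameter is generic CKAN; the resource's actual field names should be checked before filtering on specific columns.

```python
from urllib.parse import urlencode

CKAN_SEARCH = "https://data.boston.gov/api/3/action/datastore_search"
RESOURCE_ID = "9fdbdcad-67c8-4b23-b6ec-861e77d56227"  # street sweeping schedules

def sweeping_query_url(street):
    """Build a CKAN datastore_search URL full-text filtered by street name.
    Once the resource's field names are confirmed, a 'filters' parameter
    would give more precise matching than 'q'."""
    return CKAN_SEARCH + "?" + urlencode({"resource_id": RESOURCE_ID, "q": street})
```

The JSON response's `result.records` list would then be formatted into the chosen output speech.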
Some of the most common real-time status updates are provided by Boston at https://www.boston.gov/. These can be seen as icons with associated text near the top of the page; there are at least five items, sometimes more.
We should have an intent to get these statuses, either reading all of them or letting the user ask for a specific one.
There are two main words people use to start interacting with a skill: "open" and "ask".
We should look into whether we can determine from the initial request which one of those the user used to get started. Ideally we could keep the session going if it was opened with "open", and with "ask" we could close the session as soon as the question is answered.
Boston does leaf and yard waste pickup from April to December during specified weeks (schedule here: https://www.boston.gov/departments/public-works/leaf-and-yard-waste-schedule). It might be useful to query this information from Alexa. It should hopefully be very similar to the existing recycling and trash pickup intent.
It looks like we are going to proceed with Python. We need to clean up the repo so we can continue adding functionality to the skeletal skill we have so far.
The Boston digital team stated that the trash day info at data.boston.gov is often out of date. They are currently using a 3rd party called ReCollect to manage the newest trash day info.
We should update our trash day to use the info from ReCollect. Check out https://www.boston.gov/trash-day-schedule and examine what calls they are making to the ReCollect API.
Find out crime incidents that happened on your street.
Related to #50, Alexa is not currently able to handle the two-letter street names that I checked, A Street and B Street. At best, by elongating the pronunciation of A Street, Alexa thinks I said "a Street" and cannot find the address. Elongating the pronunciation of B Street, Alexa thinks I said the abbreviation "B. Street" and provides incorrect information for the particular address on B Street. For example: for 50 B Street, Alexa thinks trash and recycling are picked up Monday, but they are picked up on Friday. 110 P Street was identified correctly, but the trash days were incorrect (Mon and Fri).
Suggest that we be able to tell Alexa "Letter A Street" or something like that.
Allow users to add a flash briefing to get the city status created in #43
Flash briefing API: https://developer.amazon.com/docs/flashbriefing/understand-the-flash-briefing-skill-api.html
We need a logo for display in the Alexa App.
One thing that we've seen a lot of is the Alexa platform failing to correctly interpret more difficult street names. For instance, when we tried Everdean St., none of us could make ourselves understood to Alexa no matter how clearly we enunciated.
For these situations, it would be nice to support the ability to spell your street name if Alexa is having a hard time.
Something along the lines of:
"Alexa, my street number is ONE SIX ZERO ZERO and my street name is P - E - N - N - S - Y - L - V - A - N - I - A avenue."
Courtney noticed that we may be responding with inaccurate information on the trash day intent. We should look into this and fix if needed.
Hopefully at the very least we can set something up that uses the service simulator to check responses.
When a user requests something from Boston Data, such as trash day, and hasn't provided an address, the resulting conversation is cumbersome:
User: "When is my trash day?"
Alexa: "I'm not sure what your address is...."
User: "My address is...."
Alexa: "I now know your address..."
User: "....."
User: "When is my trash day?"
Alexa: "Your trash day is...."
We should be more responsive if we have asked for additional information to fulfill a request. For example:
User: "When is my trash day?"
Alexa: "I'm not sure what your address is...."
User: "My address is...."
Alexa: "Your trash day is...."
If this is a repeat of the issue mentioned with Tremont Street addresses, I apologize. The Trash Schedules by Address data (CSV) available from data.boston.gov sometimes contains duplicate records for what seems to be the same address, and it is not clear which days are correct. See for instance 50 Tremont St (T&R are Wednesday on one record and TH&TH for four other records, all 50 Tremont St) and 0 A St (F&F and TH&TH). No doubt there are others.
It turns out that although the README.md section "Notes on open data sources for Boston" says we are using the trash schedules by address CSV, the trash day intent actually goes out to https://recollect.net/api to get the information.
Suggest we find a way to determine the correct records and avoid loading the incorrect ones from the data that is pulled into the skill.
Asking for alerts when there are no alerts gets a silent response.
Add logic to provide a response script when there are no alerts.
Moving permits are one of the causes of car towing in the city. Residents who are granted a moving permit have two days prior to the moving date to post the sign. In some cases, people who have their cars parked in one of those permit spots are not aware of the move until their cars get towed (reasons include traveling and not using the car that frequently).
We can build functionality that allows users to ask if there are existing moving permits on a certain street.
Moving permits are available at:
https://data.boston.gov/dataset/open-moving-truck-permits
Dataset can be used to alert user of work zones on a particular street.
Currently the only piece of information we request from the user is their address (for trash day and snow parking). Right now we force the user to trigger the set address intent (in other words, we are unable to get the address unless the user says something like, "My address is ...").
Amazon provides a dialog interface that should make this interaction less awkward:
https://developer.amazon.com/docs/custom-skills/dialog-interface-reference.html
Use the dialog model so that the interaction can go something like this:
user: ask my city [intent that requires address]
alexa: please tell me your address
user: 1 main street
alexa: [response based on address]
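In Alexa terms, handing slot collection over to the dialog model is done with a Dialog.Delegate directive in the skill's response; a minimal sketch of that response body:

```python
def delegate_dialog_response():
    """Minimal Alexa response that hands slot collection back to the
    dialog model. Dialog.Delegate is part of the Alexa Skills Kit
    dialog interface linked above; the surrounding response fields
    here are the usual custom-skill response envelope."""
    return {
        "version": "1.0",
        "response": {
            "directives": [{"type": "Dialog.Delegate"}],
            "shouldEndSession": False,
        },
    }
```

The lambda handler would return this while `dialogState` is not `COMPLETED`, letting Alexa prompt for the address on our behalf.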
Train schedules can change based on the season, day of week, and holidays. This service will provide users departure and arrival times (and other relevant info) for any commuter rail trip in the Boston area.
MBTA’s API contains this information. It can be found here:
https://mbta.com/developers
The user will specify a departure and arrival location, and the service will return the next two train departure and arrival times. Optionally, it will let the user add a time and date to the query. If there's currently a delay, the service will notify the user and provide the expected departure and arrival times (as calculated and provided by the MBTA's API). It can also provide fare information.
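As a sketch, a request to the MBTA's v3 API could be built like this. The stop and route IDs shown are examples and would need to be looked up via the API's /stops and /routes endpoints.

```python
from urllib.parse import urlencode

def mbta_predictions_url(stop_id, route_id=None, api_key=None):
    """Build an MBTA v3 API predictions URL (https://api-v3.mbta.com).
    Stop and route IDs come from the /stops and /routes endpoints; the
    ones needed for a given commuter rail trip would have to be looked up."""
    params = {"filter[stop]": stop_id, "sort": "departure_time"}
    if route_id:
        params["filter[route]"] = route_id
    if api_key:
        params["api_key"] = api_key
    return "https://api-v3.mbta.com/predictions?" + urlencode(params)
```

The predictions response carries real-time departure times, so delay handling falls out of comparing them with the scheduled times.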
Once we build up a few skills, we will need to publish the skill. We need to research this process and discuss details with the CfB leadership team.
The data provided at data.boston.gov is not always up to date with correct parking locations. In addition, the parking lots often fill up so we wouldn't want to direct users to these parking lots.
We can get this real time parking data from the following ArcGIS feed:
https://services.arcgis.com/sFnw0xNflSi8J0uh/arcgis/rest/services/SnowParking/FeatureServer/0
The city of Boston is using this information at https://www.boston.gov/departments/311/snow-emergency-parking
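A sketch of the standard ArcGIS FeatureServer query for this layer, returning every feature as JSON. The layer's actual field names (capacity, fee, and so on) would need checking against its metadata.

```python
from urllib.parse import urlencode

SNOW_PARKING = ("https://services.arcgis.com/sFnw0xNflSi8J0uh/arcgis/rest/"
                "services/SnowParking/FeatureServer/0/query")

def snow_parking_query_url():
    """Standard ArcGIS FeatureServer query: all features as JSON.
    The where/outFields/f parameters are part of the ArcGIS REST API;
    the layer's field names would need checking before use."""
    return SNOW_PARKING + "?" + urlencode({"where": "1=1", "outFields": "*", "f": "json"})
```

Each returned feature carries geometry plus attributes, which is what we'd need to report lots that still have space.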
User should be able to ask what the latest N reports to 311 are.
We can get this info from https://311.boston.gov/. We can also look into using Open311.
We're tracing what our code is doing during execution of an intent using print statements. Is there a reason we can't move these statements to a logger?
def execute_request(self, mycity_request):
    """
    Route the incoming request based on type (LaunchRequest, IntentRequest,
    etc.) The JSON body of the request is provided in the event parameter.
    """
    print(
        self.LOG_CLASS,
        '[method: main]',
        'MyCityRequestDataModel received:\n',
        str(mycity_request)
    )
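A minimal sketch of what the same trace could look like with the stdlib logging module (the logger name and level are placeholders):

```python
import logging

logger = logging.getLogger("mycity")

def log_request(mycity_request):
    """Replacement for the print() trace: same information, but
    filterable by level and routable (e.g. to CloudWatch) via handlers."""
    logger.debug(
        "[method: main] MyCityRequestDataModel received:\n%s",
        mycity_request,
    )
```

Using `%s` with arguments (rather than f-strings) defers the string formatting until the record is actually emitted, which is the idiomatic logging pattern.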
There's code that's duplicated across our current intents (setting the title card, reprompt text). We could push that code into an IntentParent class and each subclass have its own logic for setting output_speech. Would people find this more confusing than how we're writing intents now?
SSML stands for Speech Synthesis Markup Language and allows us to do stuff like make Alexa whisper, pause, etc.
Explore what we'd need to add to use this and whether or not there is an analog on other platforms.
https://developer.amazon.com/docs/custom-skills/speech-synthesis-markup-language-ssml-reference.html
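For a taste of what this looks like, a sketch using two SSML tags documented in the reference above (whisper and pause); other platforms would need their own equivalents:

```python
def whispered_pause_ssml(text):
    """Wrap text in SSML: pause for one second, then whisper the message.
    <break> and <amazon:effect name="whispered"> are documented Alexa
    SSML tags; the one-second pause is an arbitrary example."""
    return (
        "<speak>"
        '<break time="1s"/>'
        '<amazon:effect name="whispered">' + text + "</amazon:effect>"
        "</speak>"
    )
```

In an Alexa response the `outputSpeech` object would then use `"type": "SSML"` with this string as the `ssml` field.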
We know that these built-in intents are important to stop Alexa from looping and trying to process everything she hears as an address, but we need a better understanding of the skill's workflow.
The stop intent is straightforward, but we should have a better idea of how to use cancel so that we can write a separate handler for it in lambda_function.py.
For some intents the output speech can become too verbose. Instead of packing all the information the user might want into one response, we can give the most important information first and then ask whether they would like the rest.
Example:
Voice Assistant: "The nearest parking lot is at 115 Adams St. Would you like to know more?"
Me: "Yes"
VA: "There is a fee of $1,000 per day."
One of the most visited pages on the City of Boston's website is the Food trucks page at https://www.boston.gov/departments/small-business-development/city-boston-food-trucks-schedule.
Add an Alexa intent which allows users who have already provided their address to ask questions such as:
Sample Utterances:
GetTruckList near {address}
GetTruckLocations given {truckname}
Possible future functionality:
GetTruckTweets
Narrow vertical stripe of functionality v0:
ASK: What are the food trucks nearest to me?
RETURN: Food trucks at the location nearest to you. Assume today. Assume lunch.
Code for intents that have the goal of finding the closest X have a common flow:
get information about X from some web resource
process information from resource into expected format
query Google Maps for the closest X to origin address
format mycity_response.output_speech
I was thinking an abstract Finder object with subclasses FinderCSV and FinderGIS might make it easier for us to write new intents that are location focused. These classes can use the utils modules in utilities to do most of the work, but the Finder objects could make our intent code more readable.
Here's a quick and dirty UML diagram:
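As a rough code sketch of that hierarchy (all class and method names are hypothetical, and the Google Maps step is stubbed out):

```python
from abc import ABC, abstractmethod

class Finder(ABC):
    """Shared flow for 'closest X' intents. Everything here is a sketch;
    names are hypothetical and the Google Maps lookup is stubbed out."""

    def __init__(self, origin_address):
        self.origin_address = origin_address

    @abstractmethod
    def fetch_records(self):
        """Step 1: get raw information about X from some web resource."""

    @abstractmethod
    def to_locations(self, records):
        """Step 2: normalize records into (name, address) pairs."""

    def closest(self):
        """Steps 3-4: pick the closest location for the intent to speak.
        A real implementation would call the Google Maps utils here."""
        locations = self.to_locations(self.fetch_records())
        return locations[0] if locations else None

class FinderCSV(Finder):
    """CSV-backed finder; a FinderGIS sibling would hit an ArcGIS feed."""

    def __init__(self, origin_address, rows):
        super().__init__(origin_address)
        self.rows = rows

    def fetch_records(self):
        return self.rows

    def to_locations(self, records):
        return [(row["name"], row["address"]) for row in records]
```

The intent code then shrinks to constructing the right Finder and formatting `closest()` into output_speech.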
Alexa responds that it doesn't understand the request.
Alexa thinks the random speech is your new address. In the above example, Alexa responds "I now know your address is blah blah blah."
When running deploy tools on Windows machines, an error is raised due to being unable to remove the created temp directory.
Creating temporary build directory ...
Traceback (most recent call last):
  File "deploy_tools.py", line 136, in <module>
    main()
  File "deploy_tools.py", line 130, in main
    package_lambda_function()
  File "deploy_tools.py", line 101, in package_lambda_function
    os.mkdir(TEMP_DIR_PATH)
FileExistsError: [WinError 183] Cannot create a file when that file already exists: 'C:\CodeForBoston\voiceapp311\mycity\mycity\deploy_tools\..\..\temp'
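One possible fix, assuming the leftover temp directory from a previous failed run is safe to delete before recreating it:

```python
import os
import shutil

def fresh_temp_dir(path):
    """Remove any leftover temp directory before recreating it, so a
    previous failed run doesn't trigger FileExistsError on Windows.
    Assumes the directory's contents are disposable build artifacts."""
    if os.path.isdir(path):
        shutil.rmtree(path)
    os.mkdir(path)
```

Calling this in place of the bare `os.mkdir(TEMP_DIR_PATH)` in package_lambda_function should make the deploy script idempotent on Windows.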
There are several doc comments throughout the project that are either incomplete, incorrect, or missing altogether. Someone should go through the project and review/update the doc comments. Any questions about specific functions or files can go to the slack channel.
Should be able to ask Alexa when the street cleaning is coming in my neighborhood
User should be able to provide their address in an information request. For example, if the user's first question is the following, it results in Boston Data asking for the address:
"When is trash day for 10 Tremont St.?"
This can be done with slots provided with the intent.
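As a sketch, the interaction-model entry could look something like this. The intent name is hypothetical; AMAZON.PostalAddress is a built-in Alexa slot type, though how well it captures full street addresses should be verified.

```python
def trash_day_intent_schema():
    """Sketch of an interaction-model intent letting the user include an
    address in the question. The intent name is hypothetical;
    AMAZON.PostalAddress is a built-in Alexa slot type."""
    return {
        "name": "TrashDayIntent",
        "slots": [{"name": "Address", "type": "AMAZON.PostalAddress"}],
        "samples": [
            "when is trash day",
            "when is trash day for {Address}",
        ],
    }
```

When the slot is filled, the handler can set the session address directly and answer in one turn; when it's empty, we fall back to prompting.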
The string "9.0.3" is greater than the string "10.0.0" when compared lexicographically.
We need to do a conversion.
@SKalmane has a fix:
major, minor, patch = [int(x, 10) for x in version.split('.')]
if major >= 10:
    _install_pip_dependencies_from_script()
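Reduced to its core, the fix is to compare numeric tuples instead of strings:

```python
def parse_version(version):
    """'10.0.0' -> (10, 0, 0): tuples compare numerically, field by field."""
    return tuple(int(part) for part in version.split("."))

# The bug and the fix, side by side:
assert "9.0.3" > "10.0.0"                                # lexicographic: '9' > '1'
assert parse_version("9.0.3") < parse_version("10.0.0")  # numeric: 9 < 10
```

This also generalizes beyond the `major >= 10` check, since whole versions become directly comparable.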
In order to easily write intents that work with multiple voice platforms, we will implement a custom data format that is able to represent requests from and responses to a variety of services.
We'll use this format in our main request handler to receive input from these services, build responses in one place, and send responses in the correct format.
We currently don't have a proper way to test our Alexa skill. Some people use flask-ask for exactly this. Someone should explore the implementation details of using it.
per @jmartini
We should come up with clearer naming than mcd for instances of the MyCityData object. PEP 8 discourages abbreviations, and this doesn't really tell the user what they are supposed to be passing in. Let's brainstorm on this.
dataset:
https://data.boston.gov/dataset/snow-emergency-parking
Potentially usable for snow emergency notification.
We currently have a nice deploy script for Alexa, but it does not make use of the ASK-CLI.
We should modify this tool to use ASK-CLI and make it extensible for use with Google's CLI.
This will also involve some directory restructuring.