
Home Assistant Google Drive Backup

screenshot

About

A complete and easy way to back up Home Assistant to Google Drive.

This is for you if you want to quickly set up a backup strategy without much fuss. It doesn't require much familiarity with Home Assistant, its architecture, or Google Drive. Detailed install instructions are provided below, but you can also just add this repo, click install, and open the Web UI; it will tell you what to do and only takes a few simple clicks.

Features Overview

  • Creates backups on a configurable schedule.
  • Uploads backups to Drive, even the ones it didn't create.
  • Cleans up old backups in Home Assistant and Google Drive, so you don't run out of space.
  • Lots of options for customization, but never requires you to write a yaml file.
  • Restore from a fresh install or recover quickly from disaster by uploading your backups directly from Google Drive.
  • Integrates with Home Assistant Notifications and provides sensors you can trigger off of.
  • Notifies you when something goes wrong with your backups.
  • Super easy installation and configuration.
  • Privacy-centric design philosophy.
  • Comprehensive documentation.
  • Most certainly doesn't mine bitcoin on your home automation server. Definitely no. Or does it?

The Upsell

This addon has been featured by %YOUR_FAVORITE_HA_YOUTUBER% and is often listed as an essential addon when starting with Home Assistant. Here are some videos about it from others if you'd like to get an idea of what using it looks like or what the community thinks:

This project requires financial support to make the Google Drive integration work, but it is free for you to use. You can join those helping to keep the lights on at:

Detailed Install Instructions

  1. Navigate in your Home Assistant frontend to Settings -> Add-ons -> Add-on Store (Bottom Right).

  2. Click the 3-dots menu at the upper right, choose Repositories, and add this repository's URL: https://github.com/sabeechen/hassio-google-drive-backup

  3. Reload the page, scroll to the bottom to find the new repository, and click the new add-on named "Home Assistant Google Drive Backup":

    Note: Home Assistant loads the repository in the background and the new item won't always show up automatically. You might need to wait a few seconds and then "hard refresh" the page for it to show up. On most browsers the keyboard shortcut for this is CTRL+F5. If it still doesn't show up, clear your browser's cache and it should then.

  4. Click Install and give it a few minutes to finish downloading.

  5. Click Start, give it a few seconds to spin up, and then click the Open Web UI button that appears.

  6. The "Getting Started" page will tell you how many backups you have and what it will do with them once you connect it to Google Drive. You can click Settings to change those options through the add-on's Web UI (recommended, since changes take effect immediately), or update them from the page where you installed the add-on as described below (this also works, but requires a restart to take effect).

  7. Click the Authenticate with Drive button to link the add-on with your Google Drive account. Alternatively, you can generate your own Google API credentials, though the process is not simple.

  8. You should be redirected automatically to the backup status page. Here you can make new backups, see the progress of uploads to Google Drive, etc. You're done!

Configuration

After you start the add-on you have an opportunity to review your settings within the add-on's Web UI before you connect it to Google Drive. It is recommended to modify the settings this way because the UI makes it easy and explains what each option does.

If you'd still prefer to edit your settings in YAML or through the supervisor, the list of configurable options with explanations is available here.
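For example, a YAML configuration might look like this (a sketch using option names from this document; the full list and defaults are in the options documentation):

```yaml
days_between_backups: 1        # create a new backup every day
backup_time_of_day: "02:00"    # at 2am, in 24-hour "HH:MM" format
max_backups_in_drive: 10       # keep at most 10 backups in Google Drive
max_backups_in_ha: 4           # and 4 locally in Home Assistant
```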

FAQ

Is this for me?

Most likely, yes. This add-on is focused on making backup simple, reliable, easy to understand, and well supported. It provides clear error messages when things go wrong and explains how to fix them. It has a fancy-pants web interface you can look at to see how things are going. To do that it sacrifices customizability. It can't:

  • Create backups more than once a day.
  • Create backups only when your configuration changes.
  • Upload somewhere other than Google Drive.
  • Be customized outside of what the settings allow.

If you want a backup strategy highly customized to your needs, you might be better off hacking something together with automations and the Samba add-on, for example. This project started out as me doing exactly that for myself, and now it's grown into a mature project with ~100k people using it all over the world. Weird. I never thought I'd be getting community pressure to translate the UI into Portuguese, but here I am dealing with those kinds of problems now.

How will I know this will be there when I need it?

Home Assistant is notorious for failing silently, and your backups aren't something you want to find is broken after an erroneous comma makes you unable to turn on any of the lights in your house. That's why I've added some functionality to keep you informed if things start to break. If the add-on runs into trouble and gets more than 12 hours behind its schedule, you'll know in two ways:

  • Notifications in Home Assistant UI

  • A binary_sensor you can use to trigger additional actions.

Redundancy is the foundation of reliability. With local backups, Google Drive's backups, and two flavors of notification I think you're covered.

How do I restore a backup?

The backups this add-on creates are the same backups that Home Assistant makes by itself and can be restored using any of the methods documented elsewhere. Here are a few pointers to get you started.

  • If you can still get to the add-on's Web UI, select the backup and click "Load into Home Assistant" to have it copied back into Home Assistant.
  • If not (eg, maybe your hard drive died and you're starting over):
    • Download one of the backups you've previously created from Google Drive.
    • On whatever hardware you're using to run Home Assistant now, follow the normal instructions to install Home Assistant.
    • Once it's running (but before you create a user), click the link on the Home Assistant setup page that says "Alternatively you can restore from a previous backup" and upload the backup you downloaded from Google Drive.
  • If you've got a backup that you'd like to restore to an already set up Home Assistant instance that doesn't already have this addon installed, you'll need to use something like the Samba Addon to copy a backup downloaded from Google Drive into the /backup folder.

I never look at HA notifications. Can I show information about backups in my Home Assistant Interface?

The add-on creates a few sensors that show the status of backups that you could trigger automations off of. binary_sensor.backups_stale becomes true when the add-on has trouble backing up or creating backups. For example, the Lovelace card below only shows up in the UI when backups go stale:

Lovelace Card

type: conditional
conditions:
  - entity: binary_sensor.backups_stale
    state_not: "off"
card:
  type: markdown
  content: >-
    Backups are stale! Please visit the "Home Assistant Google Drive Backup" add-on
    status page for details.
  title: Stale Backups!

Mobile Notifications

If you have Android, iOS, or other notifications set up, this automation would let you know if things go stale:

    - alias: Backups went stale
      id: 'backups_went_stale'
      trigger:
      - platform: state
        entity_id: binary_sensor.backups_stale
        from: 'off'
        to: 'on'
      condition: []
      action:
      - service: notify.android
        data:
          title: Backups are Stale
          message: Please visit the 'Home Assistant Google Drive Backup' add-on status page
            for details.

You could automate anything off of this binary sensor. The add-on also exposes a sensor sensor.backup_state that exposes the details of each backup. I'm working on a custom Lovelace component to expose that information.
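If you'd rather see the status at a glance, a simple entities card can display both sensors in your dashboard (a sketch; the entity names are the ones described above):

```yaml
type: entities
title: Backup Status
entities:
  - entity: sensor.backup_state       # details of the latest backup
  - entity: binary_sensor.backups_stale  # "on" when backups are behind schedule
```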

Can I specify what time of day backups should be created?

You can add "backup_time_of_day": "13:00" to your add-on configuration to make backups always happen at 1 pm. Specify the time in the 24-hour format of "HH:MM". When unspecified, the next backup will be created (roughly) at the same time of day as the last one.
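In the add-on's options that looks like this (days_between_backups shown for context; both options are described in this document):

```yaml
days_between_backups: 1
backup_time_of_day: "13:00"   # 24-hour "HH:MM" format
```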

Can I keep older backups for longer?

This is just an overview of how to keep older backups longer. See here for a more in-depth explanation.

The add-on can be configured to keep generational backups on daily, weekly, monthly, and yearly intervals instead of just deleting the oldest backup. This can be useful if, for example, you've made an erroneous change but haven't noticed for several days and all the backups before the change are gone. With a configuration setting like this...

generational_days: 3
generational_weeks: 4
generational_months: 12
generational_years: 5

... a backup will be kept for the last 3 days, the last 4 weeks, the last 12 months, and the last 5 years. Additionally, you may configure the day of the week, day of the month, and day of the year that weekly, monthly, and yearly backups are maintained.

generational_days: 3
generational_weeks: 4
generational_day_of_week: "mon" # Can be 'mon', 'tue', 'wed', 'thu', 'fri', 'sat' or 'sun' (defaults to 'mon')
generational_months: 12
generational_day_of_month: 1 # Can be 1 through 31 (defaults to 1)
generational_years: 5
generational_day_of_year: 1 # Can be 1 through 365 (defaults to 1)

  • Any combination of days, weeks, months, and years can be used. They all default to 0.
  • It's highly recommended to set 'days_between_backups: 1' to ensure a backup is available for each day.
  • Ensure you've set max_backups_in_drive appropriately high to keep enough backups (24 in the example above).
  • Once this option is enabled, it may take several days or weeks to see older backups get cleaned up. Old backups will only get deleted when the number present exceeds max_backups_in_drive or max_backups_in_ha.
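Putting those recommendations together, a complete generational configuration might look like this (a sketch based on the example above, where 24 = 3 + 4 + 12 + 5 retained backups):

```yaml
days_between_backups: 1    # recommended so a backup exists for every day
generational_days: 3
generational_weeks: 4
generational_months: 12
generational_years: 5
max_backups_in_drive: 24   # high enough to hold all generational backups
```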

I already have something that creates backups on a schedule. Can I use this just to backup to Google Drive?

If you set 'days_between_backups: 0', the add-on won't try to create new backups but will still upload any it finds to Google Drive and clean up old backups in both Home Assistant and Google Drive. This can be useful if, for example, you already have an automation that creates backups on a schedule.
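A minimal configuration for this "upload only" mode might look like this (a sketch; the add-on still enforces the retention limits):

```yaml
days_between_backups: 0   # never create backups, only sync and clean up
max_backups_in_drive: 10
max_backups_in_ha: 4
```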

Can I give backups a different name?

The config option backup_name can be changed to give backups a different name or a date format of your choosing. The default is {type} Backup {year}-{month}-{day} {hr24}:{min}:{sec}, which makes backups with a name like Full Backup 2021-10-31 14:00:00. Using the settings menu in the Web UI you can see a preview of what a backup name will look like, but you can also set it in the add-on's options. Below is the list of variables you can use to modify the name to your liking.

  • {type}: The type of backup, either 'Full' or 'Partial'
  • {year}: Year in 4 digit format (eg 2021)
  • {year_short}: Year in 2 digit format (eg 19)
  • {weekday}: Long day of the week (eg Monday, ..., Sunday)
  • {weekday_short}: Short day of week (eg Mon, ... Sun)
  • {month}: 2 digit month (eg 01, ... 12)
  • {month_long}: Month long name (January, ... , December)
  • {month_short}: Month short name (Jan, ... , Dec)
  • {ms}: Milliseconds (001, ..., 999)
  • {day}: Day of the month (01, ..., 31)
  • {hr24}: 2 digit hour of the day (00, ..., 23)
  • {hr12}: 2 digit hour of the day (01, ..., 12)
  • {min}: 2 digit minute of the hour (0, ..., 59)
  • {sec}: 2 digit second of the minute (0, ..., 59)
  • {ampm}: am or pm, depending on the time of day
  • {version_ha}: Home Assistant version string (eg 0.91.3)
  • {version_hassos}: HassOS version string (eg 0.2.15)
  • {version_super}: Supervisor version string (eg 1.2.19)
  • {date}: Locale aware date (eg 2023/01/01).
  • {time}: Locale aware time (eg 02:03:04 am)
  • {datetime}: Locale-aware datetime string
  • {isotime}: Date and time in ISO format
  • {hostname}: The Home Assistant machine's hostname
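For example, a shorter date-only name built from the variables listed above:

```yaml
backup_name: "{type} Backup {year}-{month}-{day}"
```

With the default template's example timestamp, this would produce a name like Full Backup 2021-10-31.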

Will this ever upload to Dropbox/OneDrive/FTP/SMB/MyFavoriteProtocol?

Most likely no. I started this project to solve a specific problem I had, storing backups in a redundant cloud provider without having to write a bunch of buggy logic and automations. It might seem like a small change to make this work with another cloud provider, but trust me. I wrote this version of it, and it's not a simple change. I don't have the time to do it.

But Google reads my emails!

Maybe. You can encrypt your backups by giving a password in the add-on's options.
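In the add-on's options that looks something like this (the option name backup_password is an assumption here; set it through the Settings UI if your version names it differently):

```yaml
backup_password: "a-long-passphrase-you-wont-forget"  # assumed option name; backups are encrypted with this
```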

Does this store any personal information?

As a matter of principle, I only keep track of and store information necessary for the add-on to function. To the best of my knowledge the scope of this is:

  • You can opt in to sending error reports from the add-on to a database maintained by me. This includes the full text of the error's stack trace, the error message, and the version of the add-on you're running. This helps me notice problems with new releases, but leaving it off (the default unless you turn it on) doesn't affect the functionality of the add-on in any way.
  • Once authenticated with Google, your Google credentials are only stored locally on your Home Assistant instance. This isn't your actual username and password, only an opaque token returned from Google used to verify that you previously gave the Add-on permission to access your Google Drive. Your password is never seen by me or the add-on. You can read more about how authentication with Google is accomplished here.
  • The add-on has access to the files in Google Drive it created, which is the 'Home Assistant Backups' folder and any backups it uploads. See the https://www.googleapis.com/auth/drive.file scope in the Drive REST API v3 documentation for details; this is the only scope the add-on requests for your account.
  • Google stores a history of information about the number of requests, number of errors, and latency of requests made by this Add-on and makes a graph of that visible to me. This is needed because Google only gives me a certain quota for requests shared between all users of the add-on, so I need to be aware if someone is abusing it.
  • The Add-on is distributed as a Docker container hosted on Docker Hub, which is how almost all add-ons work. Docker keeps track of how many people have requested an image and makes that information publicly visible.

This invariably means that I have a very limited ability to see how many people are using the add-on or whether it is functioning well. If you do like it, feel free to shoot me an email at [email protected] or star this repo on GitHub; it helps keep me motivated. If you run into problems or think a new feature would be nice, file an issue on GitHub.

Can I use my own Google API information to authenticate instead of yours?

On the first "Getting Started" page of the add-on, underneath the "Authenticate with Google Drive" button, is a link that lets you enter your own Client ID and Client Secret to authenticate with Google Drive. You can get back to that page by going to "Actions" -> "Reauthorize Google Drive" from the add-on's web UI if you've already connected it previously. Instructions are also provided for those who are unfamiliar with the process; it's tedious to complete but ensures the add-on's communication is only between you and Google Drive.

Can I permanently save a backup so it doesn't get cleaned up?

Select "Never Delete" from the menu next to a backup in the add-on's Web UI. You can choose to keep it from being deleted in Home Assistant or Google Drive. When you do this, the backup will no longer count against the maximum number of backups allowed in Google Drive or Home Assistant. Alternatively, you can move a backup in Google Drive out of the backup folder; the add-on will ignore any files that aren't in the backup folder. Just don't move them back in accidentally, since they'll get "cleaned up" like any old backup after a while :)

What do I do if I've found an error?

If the add-on runs into trouble and can't back up, you should see a big red box with the text of the error on the status webpage. This should include a link to pre-populate a new issue in GitHub, which I'd encourage you to do. Additionally, you can set the add-on config option "verbose": true to get information from the add-on's logs to help me with debugging.
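In YAML, that option looks like:

```yaml
verbose: true   # adds detailed debug output to the add-on's logs
```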

Will this fill up my Google Drive? Why are my backups so big?

You'll need to take care to ensure you don't configure this to blow up your Google Drive. You might want to consider:

  • If your backups are HUGE, it's probably because Home Assistant by default keeps a long sensor history. Consider setting purge_keep_days: N in your recorder configuration to trim it down to something more manageable, like 1 day of history.
  • Some other add-ons are designed to manage large amounts of media. For example, add-ons like the Plex Media Server are designed to store media in the /share folder, and Mobile Upload folders default to a sub-folder in the addons folder. If you migrate all of your media to the Home Assistant folder structure and you don't exclude it from the backup, you could easily chew up your entire Google Drive space in a single backup.
  • If you use the Google Drive Desktop sync client, you'll probably want to tell it not to sync this folder (it's available in the options).
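The recorder tip above goes in Home Assistant's configuration.yaml (not the add-on's options); for example:

```yaml
recorder:
  purge_keep_days: 1   # keep only 1 day of sensor history, shrinking backups
```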

I want my backups to sync to my Desktop computer too

That's not a question but you can use Google Drive Backup & Sync to download anything in your Google Drive to your desktop/laptop automatically.

I configured this to only keep 4 backups in Drive and Home Assistant, but sometimes I can see there are 5?

The add-on will only delete an old backup if a new one exists to replace it, so it will create a 5th one before deleting the first. This is a reliability/disk usage compromise that favors reliability because otherwise, it would have to delete an old backup (leaving only 3) before it could guarantee the 4th one exists.

Can I exclude specific sub-folders from my backup?

The add-on uses the supervisor to create backups, and the supervisor only permits you to include or exclude the 5 main folders (Home Assistant configuration, share, SSL, media, and local add-ons). Excluding specific sub-folders, or only including specific sub-folders, from a backup isn't possible today.

I'm getting weird errors. Where do I look for more details about an error? (Supervisor logs)

The add-on uses Home Assistant's "supervisor" to create and delete backups on Home Assistant's side. In case you don't know, the supervisor is something that runs in the background on Home Assistant and manages stuff like backups, connections to hardware, and setting up the environment that Home Assistant Core (eg the UI) and add-ons run in. Because of this, a lot of errors you run into (problems with the NAS, HD corruption, etc.) only show up in the supervisor's logs. The supervisor's logs are somewhat hidden by default; to view them:

  • Go to your Home Assistant user profile by clicking the user icon in the bottom left of Home Assistant's main UI.
  • Enable "Advanced Mode" in your profile.
  • Navigate to Settings > System > Logs
  • Select "Supervisor" from the drop down at the top right of the page.

The logs there keep a pretty short history, so if you have a lot of other errors/warnings happening (which is common) you might need to go check the logs right after you see errors in the add-on.

Contributors

agusalex, archef2000, chrismcneil, cogneato, dependabot[bot], enzokot, ericmatte, freginedevices, frenck, iiaironwolf, jcgoette, liju09, lucadiba, ludeeus, markvader, meichthys, misaligar, misiu, mivano, petrkotek, philclifford, pizzakid25, raoulteeuwen, reharmsen, rubenkelevra, sabeechen, sbressler, scriptist, tefinger, unformatt


hassio-google-drive-backup's Issues

Secret for password?

Would it be possible to specify the password in secrets.yaml and use !secret <name of secret> instead of plain text in the config?

Perhaps nothing is gained by that, but I would like to keep credentials in one place if possible.

Also, would it be possible to exclude a file from backup, such as said secrets.yaml?

requests.exceptions.ConnectionError: ('Connection aborted.', timeout('The write operation timed out',))

When trying to upload a backup it stops at 21% and I get the following error:

My configuration
{
"max_snapshots_in_hassio": 3,
"max_snapshots_in_google_drive": 8,
"days_between_snapshots": 7,
"use_ssl": false,
"drive_experimental": true,
"ignore_ipv6_addresses": true,
"drive_ipv4": "172.217.1.202",
"send_error_reports": true,
"snapshot_time_of_day": "22:30"
}

Traceback (most recent call last):
File "/app/backup/engine.py", line 162, in doBackupWorkflow
self._checkForBackup()
File "/app/backup/engine.py", line 443, in _checkForBackup
self.drive.saveSnapshot(to_backup, self.hassio.downloadUrl(to_backup), self.folder_id)
File "/app/backup/drive.py", line 92, in saveSnapshot
for progress in self.drivebackend.create(stream, file_metadata, MIME_TYPE):
File "/app/backup/driverequests.py", line 181, in create
partial = self.retryRequest("PUT", location, headers=headers, data=data)
File "/app/backup/driverequests.py", line 216, in retryRequest
response = request(method, url, headers=send_headers, json=json, timeout=(30, 30), data=data, stream=stream)
File "/usr/lib/python3.6/site-packages/requests/api.py", line 60, in request
return session.request(method=method, url=url, **kwargs)
File "/usr/lib/python3.6/site-packages/requests/sessions.py", line 533, in request
resp = self.send(prep, **send_kwargs)
File "/usr/lib/python3.6/site-packages/requests/sessions.py", line 646, in send
r = adapter.send(request, **kwargs)
File "/usr/lib/python3.6/site-packages/requests/adapters.py", line 498, in send
raise ConnectionError(err, request=request)
requests.exceptions.ConnectionError: ('Connection aborted.', timeout('The write operation timed out',))

Secrets password file

Hi, there is an option to exclude folders and add-ons, but I don't understand if I can exclude a single file (like secret_password).
Thanks

MemoryError

Please add info about your configuration here, along with a brief description of what you were doing and what happened. Detail is always helpful for investigating an error. You can enable verbose logging by setting {"verbose": true} in your add-on configuration and including that here:

Traceback (most recent call last):
File "/app/backup/engine.py", line 92, in doBackupWorkflow
self._checkForBackup()
File "/app/backup/engine.py", line 312, in _checkForBackup
self.drive.saveSnapshot(to_backup, self.hassio.downloadUrl(to_backup), self.folder_id)
File "/app/backup/drive.py", line 96, in saveSnapshot
media: MediaIoBaseUpload = MediaIoBaseUpload(stream, mimetype='application/tar', chunksize=262144, resumable=True)
File "/usr/lib/python3.6/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
return wrapped(*args, **kwargs)
File "/usr/lib/python3.6/site-packages/googleapiclient/http.py", line 439, in __init__
self._fd.seek(0, os.SEEK_END)
File "/app/backup/responsestream.py", line 43, in seek
self._load_all()
File "/app/backup/responsestream.py", line 17, in _load_all
self._bytes.write(chunk)
MemoryError

Can't start addon

Hello! I installed this add-on in my Hass.io. I configured the add-on, but I cannot run it.
{
"max_snapshots_in_hassio": 4,
"max_snapshots_in_google_drive": 15,
"days_between_snapshots": 1,
"use_ssl": false,
"snapshot_time_of_day": "03:00"
}
Error is:
not a valid value for dictionary value @ data['options']. Got {'max_snapshots_in_hassio': 4, 'max_snapshots_in_google_drive': 15, 'days_between_snapshots': 1, 'use_ssl': False, 'snapshot_time_of_day': '03:00'}

Hassio system log full of pinging the binary sensor every 10 seconds.

Does it really need to be hitting the sensor so often? Can this be fixed, or can I disable the sensor?

19-04-03 01:06:28 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/binary_sensor.snapshots_stale access from cebe7a76_hassio_google_drive_backup
19-04-03 01:06:38 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/binary_sensor.snapshots_stale access from cebe7a76_hassio_google_drive_backup
19-04-03 01:06:48 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/binary_sensor.snapshots_stale access from cebe7a76_hassio_google_drive_backup
(...the same line repeats every 10 seconds through 01:10:38)

httplib2.ServerNotFoundError: Unable to find the server at www.googleapis.com

Please add info about your configuration here, along with a brief description of what you were doing and what happened. Detail is always helpful for investigating an error. You can enable verbose logging by setting {"verbose": true} in your add-on configuration and including that here:

Traceback (most recent call last):
File "/app/backup/engine.py", line 92, in doBackupWorkflow
self._checkForBackup()
File "/app/backup/engine.py", line 277, in _checkForBackup
self._syncSnapshots()
File "/app/backup/engine.py", line 214, in _syncSnapshots
self.folder_id = self.drive.getFolderId()
File "/app/backup/drive.py", line 193, in getFolderId
return self._createDriveFolder()
File "/app/backup/drive.py", line 201, in _createDriveFolder
folder = self._retryDriveServiceCall(self._drive().files().create(body=file_metadata, fields='id'))
File "/app/backup/drive.py", line 65, in _drive
return build(DRIVE_SERVICE, DRIVE_VERSION, credentials=self.creds)
File "/usr/lib/python3.6/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
return wrapped(*args, **kwargs)
File "/usr/lib/python3.6/site-packages/googleapiclient/discovery.py", line 224, in build
requested_url, discovery_http, cache_discovery, cache, developerKey)
File "/usr/lib/python3.6/site-packages/googleapiclient/discovery.py", line 274, in _retrieve_discovery_doc
resp, content = http.request(actual_url)
File "/usr/lib/python3.6/site-packages/httplib2/__init__.py", line 1926, in request
cachekey,
File "/usr/lib/python3.6/site-packages/httplib2/__init__.py", line 1595, in _request
conn, request_uri, method, body, headers
File "/usr/lib/python3.6/site-packages/httplib2/__init__.py", line 1508, in _conn_request
raise ServerNotFoundError("Unable to find the server at %s" % conn.host)
httplib2.ServerNotFoundError: Unable to find the server at www.googleapis.com

Need to use a bearer token to access binary sensor

Just upgraded to HA 0.92.1, now I'm getting this error in the log:

api_password is going to deprecate. You need to use a bearer token to access /api/states/binary_sensor.snapshots_stale from 172.30.32.2

This was also mentioned in the community forum topic by another user 22 days ago. I could not find a mention of it in the issues here on GitHub. Is this something that should be fixed?
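For context, the deprecation warning above means the Home Assistant REST API should be called with a long-lived access token in an `Authorization: Bearer` header instead of the old `api_password`. A minimal sketch of what such a call looks like — the URL and token here are placeholders, and `set_sensor_state` is a hypothetical helper, not part of the add-on:

```python
import requests

HA_URL = "http://hassio/homeassistant/api"  # placeholder base URL
TOKEN = "YOUR_LONG_LIVED_ACCESS_TOKEN"      # placeholder, not a real token

def ha_headers(token):
    """Build the auth headers the Home Assistant REST API expects."""
    return {
        "Authorization": "Bearer " + token,
        "Content-Type": "application/json",
    }

def set_sensor_state(state):
    """Illustrative only: update the stale-snapshots binary sensor."""
    resp = requests.post(
        HA_URL + "/states/binary_sensor.snapshots_stale",
        headers=ha_headers(TOKEN),
        json={"state": state},
    )
    resp.raise_for_status()
    return resp
```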

binary_sensor.snapshots_stale Error

I'm seeing a lot of these errors in the logs:

2019-04-25 10:39:49,394 ERROR 
Traceback (most recent call last):
  File "/app/backup/hassio.py", line 277, in _postHaData
    requests.post(self.config.haBaseUrl() + path, headers=self.config.getHaHeaders(), json=data).raise_for_status()
  File "/usr/lib/python3.6/site-packages/requests/models.py", line 940, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 502 Server Error: Bad Gateway for url: http://hassio/homeassistant/api/states/binary_sensor.snapshots_stale

It's using the wrong URL (http://hassio/homeassistant/) for my HA as I use my own domain. Is there a way to change this?

User personal API keys

Hi,

I saw in your docs:

Google stores a history of information about the number of requests, number of errors, and latency of requests made by this Add-on and makes a graph of that visible to me. This is needed because Google only gives me a certain quota for requests shared between all users of the add-on, so I need to be aware if someone is abusing

Maybe it's possible for you to tell us how we can put in our own API keys or credentials, so we run it on our own account. That way we'd reduce the strain on your request quota, and it would help people who prefer not to depend on other people's cloud (or in this case, your keys).

Hope you will consider it!
Thanks for this awesome plugin.

Error cannot access /data/credentials.dat

Hello,

After starting the addon I get the following error in the log:

2019-05-05 13:30:32,146 INFO Loading config from /data/options.json
/usr/lib/python3.6/site-packages/oauth2client/_helpers.py:255: UserWarning: Cannot access /data/credentials.dat: No such file or directory
warnings.warn(_MISSING_FILE_MESSAGE.format(filename))

Is there a way to solve this?

httplib2.ServerNotFoundError: Unable to find the server at www.googleapis.com

I have been getting this error but it did upload once without issues.

Traceback (most recent call last):
  File "/app/backup/engine.py", line 94, in doBackupWorkflow
    self._checkForBackup()
  File "/app/backup/engine.py", line 300, in _checkForBackup
    self._syncSnapshots()
  File "/app/backup/engine.py", line 238, in _syncSnapshots
    self.folder_id = self.drive.getFolderId()
  File "/app/backup/drive.py", line 170, in getFolderId
    folder = self._retryDriveServiceCall(self._drive().files().get(fileId=folder_id, fields='id,trashed,capabilities'))
  File "/app/backup/drive.py", line 64, in _drive
    return build(DRIVE_SERVICE, DRIVE_VERSION, credentials=self.creds)
  File "/usr/lib/python3.6/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/lib/python3.6/site-packages/googleapiclient/discovery.py", line 224, in build
    requested_url, discovery_http, cache_discovery, cache, developerKey)
  File "/usr/lib/python3.6/site-packages/googleapiclient/discovery.py", line 274, in _retrieve_discovery_doc
    resp, content = http.request(actual_url)
  File "/usr/lib/python3.6/site-packages/httplib2/__init__.py", line 1926, in request
    cachekey,
  File "/usr/lib/python3.6/site-packages/httplib2/__init__.py", line 1595, in _request
    conn, request_uri, method, body, headers
  File "/usr/lib/python3.6/site-packages/httplib2/__init__.py", line 1508, in _conn_request
    raise ServerNotFoundError("Unable to find the server at %s" % conn.host)
httplib2.ServerNotFoundError: Unable to find the server at www.googleapis.com

www folder backup

Hello,
how can I back up the www folder in HA?
When I make a full snapshot the www folder is missing.
(screenshot)

Exception: A snapshot is already in progress

Hassio 0.90.2. After the issue occurs, the CPU usage stays high, around 34%.

Traceback (most recent call last):
File "/app/backup/engine.py", line 90, in doBackupWorkflow
self._checkForBackup()
File "/app/backup/engine.py", line 244, in _checkForBackup
self.snapshots.append(self.hassio.newSnapshot())
File "/app/backup/hassio.py", line 74, in newSnapshot
raise Exception("A snapshot is already in progress")
Exception: A snapshot is already in progress

Critical bug: when trying to restore backups on new install old backups get deleted without prompt

So, what happened:

I had your Add-On set up with generational backup and up to 30 backups. Then my Hass.io died on me and (as the last time this happened) I went the new and easy route of using your addon to restore.
So I installed your addon and connected it to my Drive. Thing is: because the default setting is 4 backups in Google Drive, it just deleted all my backups except the last 4.

This happened without any prompt. Out of nowhere, while I was waiting for my most recent backup to download, it deleted the snapshots. They aren't in Google Drive's recycle bin either.

(I even thought for a second that this could lead to misbehaviour, but I didn't expect it to just delete files without asking.)

Possible solutions

So, after a new setup there should be some kind of prompt: "I found various backups in your Drive. The amount exceeds your current settings. Should I delete them?" If "no" is pressed, it could ask again the next time the frontend is accessed.
Or another nice approach: write the current settings to a file within the same Google Drive folder. After authenticating, the Add-On could look for this file first and prompt: "I have found older settings with this content. Your current settings are these. Do you want to restore your old Add-On settings before going on?"

-> I guess the two approaches should eventually be combined.

I think I didn't really lose any important data, but I did lose older snapshots which I wanted to keep just in case. That's something that just shouldn't happen with an addon like this.
Hope this doesn't sound too harsh; English isn't my mother tongue. I just want to express that this bug is critical.

Settings pop-up should have save button always visible

First off let me say you have done a great job and this add-in has helped me greatly.

The first time I got to the settings pop-up, I did not realize there was more than what was showing. It was not clear that you could scroll, or how to save the content.

I was thinking that it might be nice to show the save button at the top. It would also be nice to have smarts that tell you when you have changed settings and confirm you don't want to save when hitting ESC.

socket.timeout: The read operation timed out

I honestly have no idea what causes this error, but it's happening regularly. The only thing I can think of is that maybe it's something in my firewall config?

Traceback (most recent call last):
  File "/app/backup/engine.py", line 95, in doBackupWorkflow
    self._checkForBackup()
  File "/app/backup/engine.py", line 310, in _checkForBackup
    self._syncSnapshots()
  File "/app/backup/engine.py", line 245, in _syncSnapshots
    drive_snapshots = self.drive.readSnapshots(self.folder_id)
  File "/app/backup/drive.py", line 144, in readSnapshots
    for child in self._iterateQuery(q="'{}' in parents".format(parent_id)):
  File "/app/backup/drive.py", line 231, in _iterateQuery
    response = self._retryDriveServiceCall(request)
  File "/app/backup/drive.py", line 160, in _retryDriveServiceCall
    return request.execute()
  File "/usr/lib/python3.6/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
    return wrapped(*args, **kwargs)
  File "/usr/lib/python3.6/site-packages/googleapiclient/http.py", line 846, in execute
    method=str(self.method), body=self.body, headers=self.headers)
  File "/usr/lib/python3.6/site-packages/googleapiclient/http.py", line 183, in _retry_request
    raise exception
  File "/usr/lib/python3.6/site-packages/googleapiclient/http.py", line 164, in _retry_request
    resp, content = http.request(uri, method, *args, **kwargs)
  File "/usr/lib/python3.6/site-packages/oauth2client/transport.py", line 175, in new_request
    redirections, connection_type)
  File "/usr/lib/python3.6/site-packages/oauth2client/transport.py", line 282, in request
    connection_type=connection_type)
  File "/usr/lib/python3.6/site-packages/httplib2/__init__.py", line 1926, in request
    cachekey,
  File "/usr/lib/python3.6/site-packages/httplib2/__init__.py", line 1595, in _request
    conn, request_uri, method, body, headers
  File "/usr/lib/python3.6/site-packages/httplib2/__init__.py", line 1533, in _conn_request
    response = conn.getresponse()
  File "/usr/lib/python3.6/http/client.py", line 1331, in getresponse
    response.begin()
  File "/usr/lib/python3.6/http/client.py", line 297, in begin
    version, status, reason = self._read_status()
  File "/usr/lib/python3.6/http/client.py", line 258, in _read_status
    line = str(self.fp.readline(_MAXLINE + 1), "iso-8859-1")
  File "/usr/lib/python3.6/socket.py", line 586, in readinto
    return self._sock.recv_into(b)
  File "/usr/lib/python3.6/ssl.py", line 1012, in recv_into
    return self.read(nbytes, buffer)
  File "/usr/lib/python3.6/ssl.py", line 874, in read
    return self._sslobj.read(len, buffer)
  File "/usr/lib/python3.6/ssl.py", line 631, in read
    v = self._sslobj.read(len, buffer)
socket.timeout: The read operation timed out
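Transient `socket.timeout` errors like this one are usually best absorbed with bounded retries and exponential backoff around the network call. A minimal sketch of that pattern — `with_retries` is a hypothetical helper for illustration, not part of the add-on's code:

```python
import socket
import time

def with_retries(request_fn, attempts=3, backoff=1.0):
    """Call request_fn, retrying on socket.timeout with exponential backoff."""
    for attempt in range(attempts):
        try:
            return request_fn()
        except socket.timeout:
            if attempt == attempts - 1:
                raise  # out of retries, surface the original error
            time.sleep(backoff * (2 ** attempt))

# Usage (illustrative): with_retries(lambda: request.execute())
```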

generational backup misbehaving?

Hi, I think I ran into an issue.
I configured the generational backup. (Thanks for the amazingly fast implementation, btw!)

I used the following configuration:

{
  "max_snapshots_in_hassio": 3,
  "max_snapshots_in_google_drive": 30,
  "days_between_snapshots": 1,
  "use_ssl": false,
  "send_error_reports": true,
  "generational_days": 6,
  "generational_weeks": 3,
  "generational_months": 11,
  "generational_years": 10
}

While it started well (4th of April, next on the 11th of April), I afterwards still have all my daily backups in Google Drive.
(screenshot)
Shouldn't it delete everything but the 18th and 25th?

Or am I misunderstanding the behaviour? Does it only start deleting once it reaches the max limit? But then, why did it "work" at the beginning?

Clearly, it's no serious bug as no data is lost; I'm just wondering whether something is wrong or I just don't understand.
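To make the generational idea above concrete, here is a toy sketch of how generational retention can select which snapshots to keep (one per day for the most recent days, plus one per week for older weeks). This is only an illustration of the concept under my own simplified assumptions — `generational_keep` is hypothetical and is not the add-on's actual algorithm:

```python
from datetime import date, timedelta

def generational_keep(snapshot_dates, days=6, weeks=3):
    """Pick which snapshot dates to retain under simple generational rules.

    Keeps every snapshot from the last `days` days, plus the earliest
    snapshot of each of the last `weeks` ISO weeks. Purely illustrative.
    """
    keep = set()
    latest = max(snapshot_dates)
    # Daily generation: everything within the last `days` days.
    for d in snapshot_dates:
        if (latest - d).days < days:
            keep.add(d)
    # Weekly generation: the earliest snapshot in each ISO week.
    by_week = {}
    for d in sorted(snapshot_dates):
        by_week.setdefault(d.isocalendar()[:2], d)
    for week_key in sorted(by_week, reverse=True)[:weeks]:
        keep.add(by_week[week_key])
    return keep
```

Under rules like these, daily backups older than the daily window would only survive if they are a week's designated representative, which matches the intuition in the report above.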

Ask for password twice?

Hi, I like very much your addon, it's simply great.
How about asking twice for the password in the UI settings?
As you say "an erroneous comma makes you unable to turn on any of the lights in your house".
Cheers,
Piero

Web UI not loading on SSL

Hi all,

Problem 1: when I start the plugin I get this error, and then the plugin continues its start-up:

05-15 22:51:53 INFO Loading config from /data/options.json
/usr/lib/python3.6/site-packages/oauth2client/_helpers.py:255: UserWarning: Cannot access /data/credentials.dat: No such file or directory
warnings.warn(_MISSING_FILE_MESSAGE.format(filename))

Problem 2: if I enable SSL on hassio and try to access the Web UI, I get an ERR_CONNECTION_TIMED_OUT from the browser, regardless of whether SSL is enabled or disabled in the plugin as well.

Please help me :)

Problem with binary_sensor.snapshots_stale

I installed this wonderful addon on hassio 32bit 0.90.2, hassos 2.10, hassio supervisor 152. It all works perfectly except the binary sensor, which is always false, so I can't get notifications. I'm not sure how to make use of the binary sensor; I see it has device class: problem.

This is awesome.

I don't have any issues, I just wanted to say that this is one of the best addons I've ever seen. Thank you so much for your work, and for going the extra mile and making it so easy to use and reliable. Other addons should use this as an example of how to make the UI easy to configure.

TypeError: setPending() missing 2 required positional arguments: 'retain_drive' and 'retain_ha'

Traceback (most recent call last):
File "/app/backup/engine.py", line 150, in doBackupWorkflow
self._checkForBackup()
File "/app/backup/engine.py", line 382, in _checkForBackup
self.startSnapshot()
File "/app/backup/engine.py", line 291, in startSnapshot
snapshot = self.hassio.newSnapshot(custom_name=custom_name, retain_drive=retain_drive, retain_ha=retain_ha)
File "/app/backup/hassio.py", line 201, in newSnapshot
self.pending_snapshot.setPending("Pending Snapshot", nowutc())
TypeError: setPending() missing 2 required positional arguments: 'retain_drive' and 'retain_ha'

I can't start it after the last update

Hello

I updated the add-on to 0.95 today, and now I'm unable to start it.

Nothing appears in the log.

(screenshot)

I have reinstalled the add-on without luck.

I don't know what to do, really...

TypeError: replace() argument 2 must be str, not None

Traceback (most recent call last):
File "/app/backup/engine.py", line 150, in doBackupWorkflow
self._checkForBackup()
File "/app/backup/engine.py", line 382, in _checkForBackup
self.startSnapshot()
File "/app/backup/engine.py", line 291, in startSnapshot
snapshot = self.hassio.newSnapshot(custom_name=custom_name, retain_drive=retain_drive, retain_ha=retain_ha)
File "/app/backup/hassio.py", line 204, in newSnapshot
raise self.pending_snapshot_error # pylint: disable-msg=E0702
File "/app/backup/hassio.py", line 104, in _getSnapshot
backup_name: str = self.getSnapshotName(snapshot_type, self._custom_name)
File "/app/backup/hassio.py", line 160, in getSnapshotName
template = template.replace("{version_hassos}", self.host_info.get('hassos', ''))
TypeError: replace() argument 2 must be str, not None

Log file is full from snapshot info entries .....

System log
19-03-31 02:51:02 INFO (MainThread) [hassio.api.security] /snapshots/32773edf/info access from cebe7a76_hassio_google_drive_backup
19-03-31 02:51:05 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/snapshot_backup.state access from cebe7a76_hassio_google_drive_backup
19-03-31 02:51:05 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/binary_sensor.snapshots_stale access from cebe7a76_hassio_google_drive_backup
19-03-31 03:01:31 INFO (MainThread) [hassio.homeassistant] Updated Home Assistant API token
19-03-31 03:31:31 INFO (MainThread) [hassio.homeassistant] Updated Home Assistant API token
19-03-31 03:51:08 INFO (MainThread) [hassio.api.security] /snapshots access from cebe7a76_hassio_google_drive_backup
19-03-31 03:51:08 INFO (MainThread) [hassio.api.security] /snapshots/b8f41e9f/info access from cebe7a76_hassio_google_drive_backup
19-03-31 03:51:08 INFO (MainThread) [hassio.api.security] /snapshots/d1beaf22/info access from cebe7a76_hassio_google_drive_backup
19-03-31 03:51:08 INFO (MainThread) [hassio.api.security] /snapshots/3f58545f/info access from cebe7a76_hassio_google_drive_backup
19-03-31 03:51:09 INFO (MainThread) [hassio.api.security] /snapshots/32773edf/info access from cebe7a76_hassio_google_drive_backup
19-03-31 03:51:10 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/snapshot_backup.state access from cebe7a76_hassio_google_drive_backup
19-03-31 03:51:10 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/binary_sensor.snapshots_stale access from cebe7a76_hassio_google_drive_backup
19-03-31 04:01:32 INFO (MainThread) [hassio.homeassistant] Updated Home Assistant API token
19-03-31 04:31:32 INFO (MainThread) [hassio.homeassistant] Updated Home Assistant API token
19-03-31 04:51:14 INFO (MainThread) [hassio.api.security] /snapshots access from cebe7a76_hassio_google_drive_backup
19-03-31 04:51:14 INFO (MainThread) [hassio.api.security] /snapshots/b8f41e9f/info access from cebe7a76_hassio_google_drive_backup
19-03-31 04:51:14 INFO (MainThread) [hassio.api.security] /snapshots/d1beaf22/info access from cebe7a76_hassio_google_drive_backup
19-03-31 04:51:14 INFO (MainThread) [hassio.api.security] /snapshots/3f58545f/info access from cebe7a76_hassio_google_drive_backup
19-03-31 04:51:14 INFO (MainThread) [hassio.api.security] /snapshots/32773edf/info access from cebe7a76_hassio_google_drive_backup
19-03-31 04:51:16 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/snapshot_backup.state access from cebe7a76_hassio_google_drive_backup
19-03-31 04:51:16 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/binary_sensor.snapshots_stale access from cebe7a76_hassio_google_drive_backup
19-03-31 05:01:32 INFO (MainThread) [hassio.homeassistant] Updated Home Assistant API token
19-03-31 05:31:32 INFO (MainThread) [hassio.homeassistant] Updated Home Assistant API token
19-03-31 05:51:19 INFO (MainThread) [hassio.api.security] /snapshots access from cebe7a76_hassio_google_drive_backup
19-03-31 05:51:19 INFO (MainThread) [hassio.api.security] /snapshots/b8f41e9f/info access from cebe7a76_hassio_google_drive_backup
19-03-31 05:51:20 INFO (MainThread) [hassio.api.security] /snapshots/d1beaf22/info access from cebe7a76_hassio_google_drive_backup
19-03-31 05:51:20 INFO (MainThread) [hassio.api.security] /snapshots/3f58545f/info access from cebe7a76_hassio_google_drive_backup
19-03-31 05:51:20 INFO (MainThread) [hassio.api.security] /snapshots/32773edf/info access from cebe7a76_hassio_google_drive_backup
19-03-31 05:51:22 INFO (MainThread) [hassio.api.security] /snapshots/new/full access from cebe7a76_hassio_google_drive_backup
19-03-31 05:51:22 INFO (MainThread) [hassio.snapshots] Full-Snapshot 4ab58093 start
19-03-31 05:51:22 INFO (MainThread) [hassio.snapshots] Snapshot 4ab58093 store Add-ons
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Build snapshot for add-on 15ef4d2f_esphome
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Build snapshot for add-on a0d7b954_glances
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Build snapshot for add-on core_duckdns
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Build snapshot for add-on core_check_config
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Build snapshot for add-on core_configurator
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Build snapshot for add-on core_ssh
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Build snapshot for add-on core_samba
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Build snapshot for add-on core_mosquitto
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Build snapshot for add-on cebe7a76_hassio_google_drive_backup
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Finish snapshot for addon core_check_config
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Finish snapshot for addon core_configurator
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Finish snapshot for addon a0d7b954_glances
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Finish snapshot for addon core_samba
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Finish snapshot for addon cebe7a76_hassio_google_drive_backup
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Finish snapshot for addon core_mosquitto
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Finish snapshot for addon core_ssh
19-03-31 05:51:22 INFO (MainThread) [hassio.addons.addon] Finish snapshot for addon core_duckdns
19-03-31 05:51:32 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/snapshot_backup.state access from cebe7a76_hassio_google_drive_backup
19-03-31 05:51:32 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/binary_sensor.snapshots_stale access from cebe7a76_hassio_google_drive_backup
19-03-31 05:51:40 INFO (SyncWorker_12) [hassio.docker.addon] Export image 62c7908d/armv7-addon-autobackup to /data/tmp/tmpojv0c6fr/image.tar
19-03-31 05:51:41 INFO (MainThread) [hassio.addons.addon] Finish snapshot for addon 15ef4d2f_esphome
19-03-31 05:51:42 INFO (SyncWorker_12) [hassio.docker.addon] Export image 62c7908d/armv7-addon-autobackup done
19-03-31 05:51:42 INFO (MainThread) [hassio.addons.addon] Build snapshot for add-on 62c7908d_autobackup
19-03-31 05:52:54 INFO (MainThread) [hassio.addons.addon] Finish snapshot for addon 62c7908d_autobackup
19-03-31 05:52:54 INFO (MainThread) [hassio.snapshots] Snapshot 4ab58093 store folders
19-03-31 05:52:54 INFO (SyncWorker_4) [hassio.snapshots.snapshot] Snapshot folder addons/local
19-03-31 05:52:54 INFO (SyncWorker_8) [hassio.snapshots.snapshot] Snapshot folder share
19-03-31 05:52:54 INFO (SyncWorker_14) [hassio.snapshots.snapshot] Snapshot folder ssl
19-03-31 05:52:54 INFO (SyncWorker_13) [hassio.snapshots.snapshot] Snapshot folder homeassistant
19-03-31 05:52:54 INFO (SyncWorker_4) [hassio.snapshots.snapshot] Snapshot folder addons/local done
19-03-31 05:52:54 INFO (SyncWorker_8) [hassio.snapshots.snapshot] Snapshot folder share done
19-03-31 05:52:54 INFO (SyncWorker_14) [hassio.snapshots.snapshot] Snapshot folder ssl done
19-03-31 05:57:11 INFO (SyncWorker_13) [hassio.snapshots.snapshot] Snapshot folder homeassistant done
19-03-31 05:57:12 INFO (MainThread) [hassio.api.security] /snapshots access from cebe7a76_hassio_google_drive_backup
19-03-31 05:57:13 INFO (MainThread) [hassio.api.security] /snapshots/b8f41e9f/info access from cebe7a76_hassio_google_drive_backup
19-03-31 05:57:13 INFO (MainThread) [hassio.api.security] /snapshots/d1beaf22/info access from cebe7a76_hassio_google_drive_backup
19-03-31 05:57:13 INFO (MainThread) [hassio.api.security] /snapshots/3f58545f/info access from cebe7a76_hassio_google_drive_backup
19-03-31 05:57:13 INFO (MainThread) [hassio.api.security] /snapshots/32773edf/info access from cebe7a76_hassio_google_drive_backup
19-03-31 05:57:15 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/snapshot_backup.state access from cebe7a76_hassio_google_drive_backup
19-03-31 05:57:29 INFO (MainThread) [hassio.snapshots] Full-Snapshot 4ab58093 done
19-03-31 05:57:35 INFO (MainThread) [hassio.api.security] /snapshots access from cebe7a76_hassio_google_drive_backup
19-03-31 05:57:35 INFO (MainThread) [hassio.api.security] /snapshots/b8f41e9f/info access from cebe7a76_hassio_google_drive_backup
19-03-31 05:57:35 INFO (MainThread) [hassio.api.security] /snapshots/3f58545f/info access from cebe7a76_hassio_google_drive_backup
19-03-31 05:57:35 INFO (MainThread) [hassio.api.security] /snapshots/4ab58093/info access from cebe7a76_hassio_google_drive_backup
19-03-31 05:57:35 INFO (MainThread) [hassio.api.security] /snapshots/d1beaf22/info access from cebe7a76_hassio_google_drive_backup
19-03-31 05:57:35 INFO (MainThread) [hassio.api.security] /snapshots/32773edf/info access from cebe7a76_hassio_google_drive_backup
19-03-31 05:57:37 INFO (MainThread) [hassio.api.security] /snapshots/32773edf/remove access from cebe7a76_hassio_google_drive_backup
19-03-31 05:57:37 INFO (MainThread) [hassio.snapshots] Removed snapshot file 32773edf
19-03-31 05:57:37 INFO (MainThread) [hassio.api.security] /snapshots/4ab58093/download access from cebe7a76_hassio_google_drive_backup
19-03-31 05:57:37 INFO (MainThread) [hassio.api.snapshots] Download snapshot 4ab58093
19-03-31 05:59:53 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/snapshot_backup.state access from cebe7a76_hassio_google_drive_backup
19-03-31 05:59:54 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/binary_sensor.snapshots_stale access from cebe7a76_hassio_google_drive_backup
19-03-31 06:00:04 INFO (MainThread) [hassio.api.security] /snapshots access from cebe7a76_hassio_google_drive_backup
19-03-31 06:00:04 INFO (MainThread) [hassio.api.security] /snapshots/b8f41e9f/info access from cebe7a76_hassio_google_drive_backup
19-03-31 06:00:04 INFO (MainThread) [hassio.api.security] /snapshots/d1beaf22/info access from cebe7a76_hassio_google_drive_backup
19-03-31 06:00:04 INFO (MainThread) [hassio.api.security] /snapshots/4ab58093/info access from cebe7a76_hassio_google_drive_backup
19-03-31 06:00:04 INFO (MainThread) [hassio.api.security] /snapshots/3f58545f/info access from cebe7a76_hassio_google_drive_backup
19-03-31 06:00:05 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/snapshot_backup.state access from cebe7a76_hassio_google_drive_backup
19-03-31 06:00:05 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/binary_sensor.snapshots_stale access from cebe7a76_hassio_google_drive_backup
19-03-31 06:01:32 INFO (MainThread) [hassio.homeassistant] Updated Home Assistant API token
19-03-31 06:31:32 INFO (MainThread) [hassio.homeassistant] Updated Home Assistant API token
19-03-31 07:00:09 INFO (MainThread) [hassio.api.security] /snapshots access from cebe7a76_hassio_google_drive_backup
19-03-31 07:00:09 INFO (MainThread) [hassio.api.security] /snapshots/b8f41e9f/info access from cebe7a76_hassio_google_drive_backup
19-03-31 07:00:09 INFO (MainThread) [hassio.api.security] /snapshots/d1beaf22/info access from cebe7a76_hassio_google_drive_backup
19-03-31 07:00:09 INFO (MainThread) [hassio.api.security] /snapshots/4ab58093/info access from cebe7a76_hassio_google_drive_backup
19-03-31 07:00:09 INFO (MainThread) [hassio.api.security] /snapshots/3f58545f/info access from cebe7a76_hassio_google_drive_backup
19-03-31 07:00:11 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/snapshot_backup.state access from cebe7a76_hassio_google_drive_backup
19-03-31 07:00:11 INFO (MainThread) [hassio.api.proxy] /homeassistant/api/states/binary_sensor.snapshots_stale access from cebe7a76_hassio_google_drive_backup
19-03-31 07:01:32 INFO (MainThread) [hassio.homeassistant] Updated Home Assistant API token

Exception: Request to Hassio failed, HTTP error: <Response [400]> Message: {"result": "error", "message": null}

Hello,
I am having some problems with this add-on.

This started on my first installation of the add-on. I had assumed it was due to my not reading the instructions and not enabling 'use_ssl', as setting that to true and restarting fixed the issue. However, a few days later the error has started appearing again.

The web interface for the G.Drive Backup shows the error below, and the latest backup has not been uploaded - it shows 'waiting'.

Running HASS 0.91.2 on a Pi 3, HassOS 2.10.

Traceback (most recent call last):
File "/app/backup/engine.py", line 92, in doBackupWorkflow
self._checkForBackup()
File "/app/backup/engine.py", line 278, in _checkForBackup
self._purgeHaSnapshots()
File "/app/backup/engine.py", line 271, in _purgeHaSnapshots
self.hassio.deleteSnapshot(oldest_hassio)
File "/app/backup/hassio.py", line 124, in deleteSnapshot
self._postHassioData(delete_url, {})
File "/app/backup/hassio.py", line 170, in _postHassioData
return self._validateHassioReply(requests.post(url, headers=HEADERS, json = json_data))
File "/app/backup/hassio.py", line 153, in _validateHassioReply
raise Exception('Request to Hassio failed, HTTP error: {0} Message: {1}'.format(resp, resp.text))
Exception: Request to Hassio failed, HTTP error: <Response [400]> Message: {"result": "error", "message": null}

Thanks very much for your time.

Is this normal backup behaviour?

Hi there!

I'm testing this amazing add-on; everything seems to be working, but I have a doubt:

When I make a backup, direct or scheduled, it doesn't upload to Google Drive when the snapshot finishes locally. I have to wait until the next scheduled backup. In other words, the last backup doesn't go to Google Drive until the next day.

If I press the Sync button the last backup starts uploading to Google Drive immediately, so the sync process works, but for some reason when the last backup finishes, the add-on doesn't trigger the sync to Google Drive. It remains in the 'pending' state.

(screenshot)

Is this normal?

Best regards

Error about binary sensor

Hi,
I have a lot of errors in the system log:

Error on call https://172.30.32.1:8123/api/states/binary_sensor.snapshots_stale: Cannot connect to host 172.30.32.1:8123 ssl:None [Connection refused]

It seems it tries to access another IP.

Where can I configure this binary_sensor?

Bad Gateway

Running version 0.52

I'm getting the following error.

Traceback (most recent call last):
  File "/app/backup/hassio.py", line 184, in _postHaData
    requests.post(self.config.haBaseUrl() + path, headers=headers, json=data).raise_for_status()
  File "/usr/lib/python3.6/site-packages/requests/models.py", line 940, in raise_for_status
    raise HTTPError(http_error_msg, response=self)
requests.exceptions.HTTPError: 502 Server Error: Bad Gateway for url: http://hassio/homeassistant/api/states/binary_sensor.snapshots_stale

Spelling error

I found a funny spelling error in "Snapshot State". Here is a screenshot.

(screenshot)

Sort backup listing newest at top.

As the list grows, it would be easier to show the currently active backup along with the newest first. This would avoid needing to scroll.

This is more a feature request :)

googleapiclient.errors.ResumableUploadError: <HttpError 403 "The user's Drive storage quota has been exceeded.">

Please add info about your configuration here, along with a brief description of what you were doing and what happened. Detail is always helpful for investigating an error. You can enable verbose logging by setting {"verbose": true} in your add-on configuration and including that here:

Traceback (most recent call last):
File "/app/backup/engine.py", line 83, in doBackupWorkflow
self._checkForBackup()
File "/app/backup/engine.py", line 296, in _checkForBackup
self.drive.saveSnapshot(to_backup, self.hassio.downloadUrl(to_backup), self.folder_id)
File "/app/backup/drive.py", line 100, in saveSnapshot
status2, drive_response = self._retryDriveServiceCall(request, func=lambda a: a.next_chunk())
File "/app/backup/drive.py", line 151, in _retryDriveServiceCall
raise e
File "/app/backup/drive.py", line 146, in _retryDriveServiceCall
return func(request)
File "/app/backup/drive.py", line 100, in
status2, drive_response = self._retryDriveServiceCall(request, func=lambda a: a.next_chunk())
File "/usr/lib/python3.6/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
return wrapped(*args, **kwargs)
File "/usr/lib/python3.6/site-packages/googleapiclient/http.py", line 927, in next_chunk
raise ResumableUploadError(resp, content)
googleapiclient.errors.ResumableUploadError: <HttpError 403 "The user's Drive storage quota has been exceeded.">
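This 403 means the Google account backing the upload has run out of Drive storage. The Drive v3 API exposes the account's quota via `about.get` with `fields="storageQuota"`; a small sketch of checking the remaining space — `free_drive_bytes` is a hypothetical helper, and the response shape shown in the docstring is an assumption based on the v3 `about` resource:

```python
def free_drive_bytes(about_response):
    """Compute remaining Drive storage from an about().get() response.

    `about_response` is the dict returned by the Drive v3 API for
    about().get(fields="storageQuota").execute(), e.g.:
      {"storageQuota": {"limit": "16106127360", "usage": "15000000000"}}
    Returns None for accounts with no fixed limit (no "limit" key).
    """
    quota = about_response["storageQuota"]
    if "limit" not in quota:
        return None
    # The API returns these values as strings of bytes.
    return int(quota["limit"]) - int(quota["usage"])

# With a real authorized service object (illustrative):
# free = free_drive_bytes(service.about().get(fields="storageQuota").execute())
```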

googleapiclient.errors.ResumableUploadError: <HttpError 400 "Invalid value for: Invalid format: "2019-03-29T12:15:36" is too short">

Ran the first backup and the error below appeared. The backup seems to have gone through completely, though, and is marked as backed up in the add-on.

Traceback (most recent call last):
File "/app/backup/engine.py", line 90, in doBackupWorkflow
self._checkForBackup()
File "/app/backup/engine.py", line 263, in _checkForBackup
self.drive.saveSnapshot(to_backup, self.hassio.downloadUrl(to_backup), self.folder_id)
File "/app/backup/drive.py", line 130, in saveSnapshot
status2, response = self._retryDriveServiceCall(request, func = lambda a : a.next_chunk())
File "/app/backup/drive.py", line 182, in _retryDriveServiceCall
raise e
File "/app/backup/drive.py", line 172, in _retryDriveServiceCall
return func(request)
File "/app/backup/drive.py", line 130, in
status2, response = self._retryDriveServiceCall(request, func = lambda a : a.next_chunk())
File "/usr/lib/python3.6/site-packages/googleapiclient/_helpers.py", line 130, in positional_wrapper
return wrapped(*args, **kwargs)
File "/usr/lib/python3.6/site-packages/googleapiclient/http.py", line 927, in next_chunk
raise ResumableUploadError(resp, content)
googleapiclient.errors.ResumableUploadError: <HttpError 400 "Invalid value for: Invalid format: "2019-03-29T12:15:36" is too short">
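The 400 above is Drive rejecting a timestamp that has no UTC offset: "2019-03-29T12:15:36" is missing the timezone suffix that an RFC 3339 timestamp requires. In Python, a timezone-aware `datetime` serializes with the offset while a naive one does not:

```python
from datetime import datetime, timezone

# A naive datetime serializes without an offset — the "too short" form:
naive = datetime(2019, 3, 29, 12, 15, 36)
assert naive.isoformat() == "2019-03-29T12:15:36"

# Attaching a timezone yields a full RFC 3339 timestamp:
aware = datetime(2019, 3, 29, 12, 15, 36, tzinfo=timezone.utc)
assert aware.isoformat() == "2019-03-29T12:15:36+00:00"
```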

Exception: Request to Hassio failed, HTTP error: <Response [400]> Message: {"result": "error", "message": null}

Hi,
I just installed this, but I'm getting an error. It creates the snapshot OK; I'm running Home Assistant 0.91.4.
Thanks!

Traceback (most recent call last):
  File "/app/backup/engine.py", line 96, in doBackupWorkflow
    self._checkForBackup()
  File "/app/backup/engine.py", line 314, in _checkForBackup
    self._purgeHaSnapshots()
  File "/app/backup/engine.py", line 300, in _purgeHaSnapshots
    self.hassio.deleteSnapshot(oldest_hassio)
  File "/app/backup/hassio.py", line 164, in deleteSnapshot
    self._postHassioData(delete_url, {})
  File "/app/backup/hassio.py", line 247, in _postHassioData
    return self._validateHassioReply(requests.post(url, headers=self.config.getHassioHeaders(), json=json_data))
  File "/app/backup/hassio.py", line 225, in _validateHassioReply
    raise Exception('Request to Hassio failed, HTTP error: {0} Message: {1}'.format(resp, resp.text))
Exception: Request to Hassio failed, HTTP error: <Response [400]> Message: {"result": "error", "message": null}
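A 400 with `"result": "error"` and a null message on a delete call can mean the snapshot slug no longer exists, in which case aborting the whole purge loop is unnecessarily strict. A hedged sketch of how such a reply could be classified (the function `classify_delete_reply` and its return values are illustrative, not the add-on's actual API):

```python
def classify_delete_reply(status_code: int, body: dict) -> str:
    """Classify a Supervisor delete-snapshot reply (illustrative sketch).

    A 400 with result=error and a null message, as in the traceback
    above, may just mean the slug is already gone; treating that as
    'gone' instead of raising keeps the purge loop from aborting.
    """
    if 200 <= status_code < 300 and body.get("result") == "ok":
        return "deleted"
    if status_code == 400 and body.get("message") is None:
        return "gone"   # already removed or unknown slug; log and continue
    return "error"      # genuine failure: surface to the user

print(classify_delete_reply(400, {"result": "error", "message": None}))
# gone
```

The caller would then only raise for the `"error"` case and merely log the `"gone"` case.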

Top level menu overlaps

The top-level menu items overlap with other UI elements when the screen is resized down.

double backup created

I have this add-on set up to create snapshots every 3 days at 3:00 am, but each time it creates two snapshots: one at 3:00 and a second around 3:12. This effectively halves my real snapshot history, since both snapshots count against the retention limit.
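One way a scheduler can avoid this kind of double-fire is a grace-period guard: if a backup was created within the last hour, a second trigger in the same window is ignored. This is a hypothetical sketch of that idea, not the add-on's actual scheduling code; the function name and parameters are mine:

```python
from datetime import datetime, timedelta

def should_create_backup(last_backup: datetime, now: datetime,
                         interval_days: int = 3,
                         grace: timedelta = timedelta(hours=1)) -> bool:
    """Return True if a new scheduled backup is due.

    Illustrative guard: a second trigger shortly after a backup
    (e.g. 3:00 then 3:12) falls inside `grace` and is ignored.
    """
    if now - last_backup < grace:
        return False  # duplicate trigger in the same scheduling window
    return now - last_backup >= timedelta(days=interval_days)

# A 3:12 re-trigger right after the 3:00 backup is suppressed:
print(should_create_backup(datetime(2019, 4, 1, 3, 0),
                           datetime(2019, 4, 1, 3, 12)))
# False
```

Three days later the guard passes and a single backup is created.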
