
hass-addons's Introduction

Home Assistant Add-ons: Greg's repository

Add-ons for Home Assistant written by Greg Rapp.

Add-ons can be installed and configured via the Home Assistant frontend on systems running Home Assistant.

Add-ons provided by this repository

hass-addons's Issues

AttributeError: 'NoneType' object has no attribute 'encode'

Hello,

I am trying to get my backups into S3; however, when I try to run the add-on, I get the following error:

[s6-init] making user provided files available at /var/run/s6/etc...exited 0.
[s6-init] ensuring user provided files have correct perms...exited 0.
[fix-attrs.d] applying ownership & permissions fixes...
[fix-attrs.d] done.
[cont-init.d] executing container initialization scripts...
[cont-init.d] 00-banner.sh: executing... 
-----------------------------------------------------------
 Add-on: Amazon S3 Backup
 Automatically backup Home Assistant snapshots to Amazon S3
-----------------------------------------------------------
 Add-on version: 1.1
 You are running the latest version of this add-on.
 System: Home Assistant OS 6.4  (aarch64 / raspberrypi4-64)
 Home Assistant Core: 2021.9.7
 Home Assistant Supervisor: 2021.09.6
-----------------------------------------------------------
 Please, share the above information when looking for help
 or support in, e.g., GitHub, forums or the Discord chat.
-----------------------------------------------------------
[cont-init.d] 00-banner.sh: exited 0.
[cont-init.d] 01-log-level.sh: executing... 
Log level is set to DEBUG
[cont-init.d] 01-log-level.sh: exited 0.
[cont-init.d] done.
[services.d] starting services
[services.d] done.
[20:35:09] INFO: Starting Amazon S3 Backup...
DEBUG:__main__:Local file /backup/0a98d79d.tar found in S3 with matching size of 101488640 bytes
WARNING:__main__:Local file /backup/4286cc97.tar not found in S3
Traceback (most recent call last):
  File "/usr/bin/amazon-s3-backup/amazon-s3-backup.py", line 201, in <module>
    upload_file(file, s3_bucket, supervisor_api)
  File "/usr/bin/amazon-s3-backup/amazon-s3-backup.py", line 153, in upload_file
    s3_bucket.upload_file(str(file), metadata)
  File "/usr/bin/amazon-s3-backup/s3bucket.py", line 64, in upload_file
    self.s3_client.upload_file(Filename=file,
  File "/usr/lib/python3.8/site-packages/boto3/s3/inject.py", line 129, in upload_file
    return transfer.upload_file(
  File "/usr/lib/python3.8/site-packages/boto3/s3/transfer.py", line 279, in upload_file
    future.result()
  File "/usr/lib/python3.8/site-packages/s3transfer/futures.py", line 106, in result
    return self._coordinator.result()
  File "/usr/lib/python3.8/site-packages/s3transfer/futures.py", line 265, in result
    raise self._exception
  File "/usr/lib/python3.8/site-packages/s3transfer/tasks.py", line 126, in __call__
    return self._execute_main(kwargs)
  File "/usr/lib/python3.8/site-packages/s3transfer/tasks.py", line 150, in _execute_main
    return_value = self._main(**kwargs)
  File "/usr/lib/python3.8/site-packages/s3transfer/upload.py", line 692, in _main
    client.put_object(Bucket=bucket, Key=key, Body=body, **extra_args)
  File "/usr/lib/python3.8/site-packages/botocore/client.py", line 357, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/usr/lib/python3.8/site-packages/botocore/client.py", line 648, in _make_api_call
    request_dict = self._convert_to_request_dict(
  File "/usr/lib/python3.8/site-packages/botocore/client.py", line 694, in _convert_to_request_dict
    api_params = self._emit_api_params(
  File "/usr/lib/python3.8/site-packages/botocore/client.py", line 723, in _emit_api_params
    self.meta.events.emit(
  File "/usr/lib/python3.8/site-packages/botocore/hooks.py", line 356, in emit
    return self._emitter.emit(aliased_event_name, **kwargs)
  File "/usr/lib/python3.8/site-packages/botocore/hooks.py", line 228, in emit
    return self._emit(event_name, kwargs)
  File "/usr/lib/python3.8/site-packages/botocore/hooks.py", line 211, in _emit
    response = handler(**kwargs)
  File "/usr/lib/python3.8/site-packages/botocore/handlers.py", line 529, in validate_ascii_metadata
    value.encode('ascii')
AttributeError: 'NoneType' object has no attribute 'encode'
[cont-finish.d] executing container finish scripts...
[cont-finish.d] 99-message.sh: executing... 
[cont-finish.d] 99-message.sh: exited 0.
[cont-finish.d] done.
[s6-finish] waiting for services.
[s6-finish] sending all processes the TERM signal.
[s6-finish] sending all processes the KILL signal and exiting.

I thought it was just one bad snapshot, but after deleting that one, another failed with the same issue.

If you need any more info, please let me know.
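
For reference, botocore's validate_ascii_metadata handler calls .encode('ascii') on every metadata value, so the traceback suggests one of the snapshot fields used to build the S3 object metadata is None. A minimal defensive sketch (the metadata dict and helper name are illustrative, not the add-on's actual code):

import boto3

s3_client = boto3.client("s3")

def upload_with_clean_metadata(filename, bucket, key, metadata):
    # botocore rejects non-string metadata values, so coerce everything
    # to str and drop missing (None) fields instead of passing them through.
    clean = {k: str(v) for k, v in metadata.items() if v is not None}
    s3_client.upload_file(
        Filename=filename,
        Bucket=bucket,
        Key=key,
        ExtraArgs={"Metadata": clean},
    )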

AttributeError: 'NoneType' object has no attribute 'get'

Hello, I got the following error when I tried to use your add-on:

Exception in thread Thread-1:
Traceback (most recent call last):
  File "/usr/lib/python3.8/threading.py", line 932, in _bootstrap_inner
    self.run()
  File "/usr/lib/python3.8/site-packages/watchdog/observers/api.py", line 199, in run
    self.dispatch_events(self.event_queue, self.timeout)
  File "/usr/lib/python3.8/site-packages/watchdog/observers/api.py", line 368, in dispatch_events
    handler.dispatch(event)
  File "/usr/lib/python3.8/site-packages/watchdog/events.py", line 537, in dispatch
    _method_map[event_type](event)
  File "/usr/bin/amazon-s3-backup/amazon-s3-backup.py", line 34, in on_created
    self.process(event)
  File "/usr/bin/amazon-s3-backup/amazon-s3-backup.py", line 53, in process
    upload_file(Path(event.src_path),
  File "/usr/bin/amazon-s3-backup/amazon-s3-backup.py", line 145, in upload_file
    snapshot_detail = supervisor_api.get_snapshot(slug)
  File "/usr/bin/amazon-s3-backup/supervisorapi.py", line 105, in get_snapshot
    return response.get("data")
AttributeError: 'NoneType' object has no attribute 'get'
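
The traceback indicates the Supervisor API call returned no parsed body, so response is None by the time .get("data") is called. A hedged sketch of a guard the caller could use (the endpoint and function signature here are assumptions based on the traceback, not the add-on's actual code):

from typing import Optional

import requests

SUPERVISOR_URL = "http://supervisor"  # assumed base URL for illustration

def get_snapshot(slug: str, token: str) -> Optional[dict]:
    # Return None instead of raising when the Supervisor responds with an
    # error status or an empty/unparseable body.
    resp = requests.get(
        f"{SUPERVISOR_URL}/snapshots/{slug}/info",
        headers={"Authorization": f"Bearer {token}"},
        timeout=10,
    )
    if resp.status_code != 200:
        return None
    try:
        body = resp.json()
    except ValueError:
        return None
    return body.get("data") if body else None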

No module named boto3

Hi,

I'm super motivated to get this working; I'm not sure why it isn't a core feature of HASS. So, thanks for putting in the effort :)

Having an issue when running it, though: the logs suggest boto3 isn't installed:

Traceback (most recent call last):
  File "/usr/bin/amazon-s3-backup/amazon-s3-backup.py", line 12, in <module>
    from s3bucket import S3Bucket, S3BucketError
  File "/usr/bin/amazon-s3-backup/s3bucket.py", line 3, in <module>
    import boto3
ModuleNotFoundError: No module named 'boto3'

The config looks correct to me, so I'm not sure if there's a step I missed during installation?
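
Since the add-on code imports boto3 directly, the package is presumably meant to be baked into the add-on image at build time, so this usually points to an incomplete image build rather than a configuration problem. A small sketch of a friendlier startup check (purely illustrative, not part of the add-on):

import sys

try:
    import boto3  # expected to be installed in the add-on image at build time
except ModuleNotFoundError:
    sys.exit(
        "boto3 is not installed in this container; the add-on image may not "
        "have built correctly. Try uninstalling and reinstalling the add-on "
        "so the image is rebuilt."
    )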

Handle more S3 providers?

Hi Greg,

I'm reaching out to see if you're interested in adding more S3 providers to your aws-s3-backup repo.

I've already cloned your repo and adapted your wonderful work to include Wasabi as an S3 provider. If this repo is something you want to keep pushing forward, I'll open a PR to contribute it.

Or, if you want to keep it AWS-centric, I respect that and will leave it be.

Thanks, and nice work, it was just what I was looking for!

G.
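
For context, boto3 can already talk to most S3-compatible providers by pointing the client at a custom endpoint, which is roughly what such a change would expose as an add-on option. A minimal sketch assuming a Wasabi endpoint and placeholder credentials (the option names are illustrative):

import boto3

# Wasabi (and most S3-compatible services) only needs a custom endpoint URL;
# the rest of the boto3 S3 API is unchanged.
s3 = boto3.client(
    "s3",
    endpoint_url="https://s3.wasabisys.com",   # provider-specific endpoint
    aws_access_key_id="YOUR_ACCESS_KEY",       # placeholder credentials
    aws_secret_access_key="YOUR_SECRET_KEY",
    region_name="us-east-1",
)

s3.upload_file(
    Filename="/backup/0a98d79d.tar",
    Bucket="my-backups",
    Key="0a98d79d.tar",
)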

Can't change storage class

Can you please provide examples of the supported storage classes?
I tried "One Zone-IA" and it threw errors saying the value is unsupported.
