
kappa's People

Contributors

astewart-twist, bruno-carrier-lookout, christophermanning, coopernurse, garnaat, gumuz, iserko, josegonzalez, laiso, pas256, rodrigosaito, ryansb, samuel-soubeyran, wvidana


kappa's Issues

S3 Event Source fails on update

The S3 event source causes any "update_event_sources" command to fail because the S3EventSource class is missing an update method.

Traceback (most recent call last):
  File "/home/ryansb/.pyenv/versions/hugo-lambda/bin/kappa", line 155, in <module>
    cli(obj={})
  File "/home/ryansb/.pyenv/versions/hugo-lambda/lib/python2.7/site-packages/click/core.py", line 664, in __call__
    return self.main(*args, **kwargs)
  File "/home/ryansb/.pyenv/versions/hugo-lambda/lib/python2.7/site-packages/click/core.py", line 644, in main
    rv = self.invoke(ctx)
  File "/home/ryansb/.pyenv/versions/hugo-lambda/lib/python2.7/site-packages/click/core.py", line 991, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/ryansb/.pyenv/versions/hugo-lambda/lib/python2.7/site-packages/click/core.py", line 837, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/ryansb/.pyenv/versions/hugo-lambda/lib/python2.7/site-packages/click/core.py", line 464, in invoke
    return callback(*args, **kwargs)
  File "/home/ryansb/.pyenv/versions/hugo-lambda/bin/kappa", line 150, in update_event_sources
    context.update_event_sources()
  File "/home/ryansb/.pyenv/versions/hugo-lambda/lib/python2.7/site-packages/kappa/context.py", line 123, in update_event_sources
    event_source.update(self.function)
AttributeError: 'S3EventSource' object has no attribute 'update'
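
A minimal sketch of the kind of patch that would unblock this, assuming that re-applying the S3 notification configuration is an acceptable way to "update" (the real fix in kappa/event_source.py may differ):

# Hypothetical method on the S3EventSource class named in the traceback above.
def update(self, function):
    # For S3, re-adding the notification configuration overwrites the
    # previous one, so updating can simply delegate to add().
    self.add(function)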

Starting troubles

First off, I wanted to mention how awesome this project idea is, and thank you for implementing it. I am just starting out with kappa and ran into the following issue on my first run. The README seems to be out of date with the current command-line options, and the command-line help doesn't really say what it expects for the "config" placeholder. I tried passing the file name "kappa.yml" for the config option and ended up with the following error. Any thoughts or suggestions on what could be going wrong here?

(lambdaStats) osboxes@osboxes:~/projects/lambdaStats2> kappa --debug kappa.yml status
Traceback (most recent call last):
  File "/home/osboxes/envs/lambdaStats/bin/kappa", line 155, in <module>
    cli(obj={})
  File "/home/osboxes/envs/lambdaStats/local/lib/python2.7/site-packages/click/core.py", line 664, in __call__
    return self.main(*args, **kwargs)
  File "/home/osboxes/envs/lambdaStats/local/lib/python2.7/site-packages/click/core.py", line 644, in main
    rv = self.invoke(ctx)
  File "/home/osboxes/envs/lambdaStats/local/lib/python2.7/site-packages/click/core.py", line 991, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/home/osboxes/envs/lambdaStats/local/lib/python2.7/site-packages/click/core.py", line 837, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/home/osboxes/envs/lambdaStats/local/lib/python2.7/site-packages/click/core.py", line 464, in invoke
    return callback(*args, **kwargs)
  File "/home/osboxes/envs/lambdaStats/bin/kappa", line 98, in status
    status = context.status()
  File "/home/osboxes/envs/lambdaStats/local/lib/python2.7/site-packages/kappa/context.py", line 173, in status
    status['function'] = self.function.status()
  File "/home/osboxes/envs/lambdaStats/local/lib/python2.7/site-packages/kappa/function.py", line 192, in status
    LOG.debug('getting status for function %s', self.name)
  File "/home/osboxes/envs/lambdaStats/local/lib/python2.7/site-packages/kappa/function.py", line 38, in name
    return self._config['name']
KeyError: 'name'

Make Kappa More Usable as a Library

I'm now using Kappa as a dependency in Zappa, but this requires some fairly ugly hacking: https://github.com/Miserlou/Zappa/blob/master/zappa/util.py#L97 This may be too much to ask for, but there are a few places where it'd be really great if Kappa acted a bit more like a library and less like a client.

Specifically, the ability to pass in our own Id and LambdaArns rather than having them generated, and returning responses in addition to logging where possible. That, and comments in the code, please!

Anyway, this might be a stretch but I figure I'd ask anyway. Thanks for all the great work!

Unable to import module 'index' error

I'm getting this error after uploading and invoking the function from a mobile application:

2015-06-29T15:45:37: Unable to import module 'index': Error
    at Function.Module._resolveFilename (module.js:338:15)
    at Function.Module._load (module.js:280:25)
    at Module.require (module.js:364:17)
    at require (module.js:380:17)
    at Object.<anonymous> (/var/task/index.js:2:1)
    at Module._compile (module.js:456:26)
    at Object.Module._extensions..js (module.js:474:10)
    at Module.load (module.js:356:32)
    at Function.Module._load (module.js:312:12)
    at Module.require (module.js:364:17)

My directory structure is:

  • index.js
  • node-modules/
  • config.yml
  • event.json
  • package.json

Deleting a function should delete its log group

If you don't delete the log group associated with a function and then you recreate the function, you can start getting InvalidAccessKey errors when you run your Lambda function. Deleting the log group will prevent this error.
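
In the meantime, the stale log group can be removed directly via the CloudWatch Logs API; a small boto3 sketch, assuming the default /aws/lambda/<function> naming convention:

import boto3

def delete_function_log_group(function_name, region_name=None):
    # Remove the log group Lambda created for this function, if it exists.
    logs = boto3.client('logs', region_name=region_name)
    try:
        logs.delete_log_group(logGroupName='/aws/lambda/' + function_name)
    except logs.exceptions.ResourceNotFoundException:
        pass  # nothing to clean up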

Use IAM Role Granted to EC2 Instance Kappa is Run From

Hello -

Is there any way to get kappa to pick up the role permissions that a particular EC2 instance has instead of relying on a profile name?

The profile name works well on my local machine, but when I have to build libraries on EC2 and test before deployment, I'm having a hard time leveraging kappa for automatic deployment, as it's looking for AWS config parameters that match those in kappa.yml.

I can use boto within Python just fine and it picks up the permissions from the host system, but it seems kappa is forcing a specific profile name when setting up boto.
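
For reference, boto3 falls back to EC2 instance-profile credentials on its own when no profile is forced; a sketch of session construction that would allow that (kappa's internal setup may differ):

import boto3

def make_session(profile_name=None, region_name=None):
    # Only pin a profile when one is actually configured; otherwise let boto3
    # walk its normal chain (env vars, shared credentials, instance role).
    if profile_name:
        return boto3.session.Session(profile_name=profile_name,
                                     region_name=region_name)
    return boto3.session.Session(region_name=region_name)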

Hard-coded excluded_dirs

function.py hard-codes eight directories that are always excluded, which breaks deployment of any project that happens to use those packages. I discovered this when the jmespath dependency in my project mysteriously refused to be included in the deployed package; some code spelunking revealed the hard-coded list in function.py.

Any chance of just removing that list entirely? As far as I can tell, kappa is not caching any of those dependencies in my src folder anyway, so I don't think it needs to exclude them.
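
If the list has to stay, making it opt-in per project would also solve this; a rough sketch of a bundler that excludes only what the caller asks for (kappa's real packaging code is organised differently, so this is illustrative):

import os
import zipfile

def make_zip(lambda_dir, zipfile_name, excluded=()):
    # Bundle lambda_dir, skipping only caller-specified directory names, so
    # dependencies like jmespath are included unless the project opts out.
    with zipfile.ZipFile(zipfile_name, 'w', zipfile.ZIP_DEFLATED) as zf:
        for root, dirs, files in os.walk(lambda_dir):
            dirs[:] = [d for d in dirs if d not in excluded]
            for name in files:
                path = os.path.join(root, name)
                zf.write(path, os.path.relpath(path, lambda_dir))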

Cache for function data could live in S3 metadata

If S3 is being used for the code, the cache contents related to the function code could live in the metadata on the S3 object. This would reduce the amount of state maintained locally outside of version control.
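
boto3 already exposes everything this would need: user metadata can be written when the code object is uploaded and read back with a HEAD request. A rough sketch (the bucket/key layout and the 'sha256' metadata key are illustrative, not kappa's actual cache format):

import boto3
from botocore.exceptions import ClientError

s3 = boto3.client('s3')

def upload_code(bucket, key, zip_path, checksum):
    # Store the deployment checksum as user metadata on the code object.
    with open(zip_path, 'rb') as fp:
        s3.put_object(Bucket=bucket, Key=key, Body=fp,
                      Metadata={'sha256': checksum})

def cached_checksum(bucket, key):
    # A HEAD request returns the metadata without downloading the code.
    try:
        return s3.head_object(Bucket=bucket, Key=key)['Metadata'].get('sha256')
    except ClientError:
        return None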

KeyError 'EventSourceArn' when running status

Hi, I am very new to Kappa. I was using it to check the status of an existing Lambda task we are using, and I get a KeyError for 'EventSourceArn':

  File "/Users/ses/w/lemonade-tasks/venv/bin/kappa", line 125, in status
    event_source['EventSourceArn'], event_source['State'])
KeyError: 'EventSourceArn'

(I suppressed the rest of the stack trace, it was just a few frames of click internals).

Here is the event_source dict (the only event source I have for this lambda function in fact):

{u'Endpoint': 'arn:aws:lambda:eu-west-1:575.....:function:sendEmailVerificationEmail',
 u'Owner': '575.....',
 u'Protocol': 'lambda',
 u'SubscriptionArn': 'arn:aws:sns:eu-west-1:575.....:accounts-email_verify_started:89abcdef-1234-1234-1234-456789abcdef',
 u'TopicArn': 'arn:aws:sns:eu-west-1:575.....:accounts-email_verify_started'}

I suppressed our actual IDs and stuff.

I wouldn't mind fixing this issue myself, but as I am so new to Lambda I'm not sure why it is happening, which of these fields should replace EventSourceArn in this case, or whether this is a case of me having an incorrect boto version or something (boto3==1.1.0, kappa==0.3.1, FYI).
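
A defensive sketch of how the status display could tolerate SNS-style dicts like the one above (the real fix may well live elsewhere):

# Hypothetical helper: SNS subscriptions carry TopicArn rather than
# EventSourceArn/State, so fall back instead of raising KeyError.
def describe_event_source(event_source):
    arn = event_source.get('EventSourceArn') or event_source.get('TopicArn', '?')
    state = event_source.get('State', 'n/a')
    return '{} ({})'.format(arn, state)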

Kappa with Read-Only IAM Permissions

Hey Mitch!

First of all, thanks for writing Kappa, it's just what I needed!

Unfortunately, for the project I'm currently working on, I am just a lowly developer with read-only access to IAM; write access is guarded by the high and mighty folks in operations!

They've very kindly created an AWSLambdaExecute policy for me, but it looks like Kappa requires my own account to have IAM write access to call the 'create' command for the first deployment, and update_code won't work until there is something there to update.

Any suggestions for running Kappa without being an administrator?

Even having the documentation include a minimum viable set of roles would be helpful, but even better would be a way to deploy the initial code without having to call the permissions steps. Either make the initial workflow:

create_policy_and_roles
deploy_code
update_code

so that I could just skip the create_policy_and_roles step, or, perhaps more simply, have

update_code

call create_function in the event that there is no currently deployed code.

What do you think?

Thanks!
R

Support for CloudWatch Events as event source

I am using this as an event source; it is a scheduled CloudWatch Events rule:

event_sources:
  - arn: arn:aws:events:us-east-1:XXXX:rule/some-rule

And getting this error:

ValueError: Unknown event source: arn:aws:events:us-east-1:XXXX:rule/some-rule

I checked the code, and CloudWatch Events does not seem to be implemented as a type of event source.

I can help with a PR if you can give me some direction on what I need to do to implement it.
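
For reference, the underlying AWS calls such an event source would need are roughly these; a plain boto3 sketch rather than kappa's internal EventSource interface:

import uuid
import boto3

def wire_rule_to_function(rule_name, rule_arn, function_name, function_arn,
                          region_name=None):
    events = boto3.client('events', region_name=region_name)
    awslambda = boto3.client('lambda', region_name=region_name)
    # Point the existing rule at the function.
    events.put_targets(Rule=rule_name,
                       Targets=[{'Id': function_name, 'Arn': function_arn}])
    # Allow CloudWatch Events to invoke the function.
    awslambda.add_permission(FunctionName=function_name,
                             StatementId=str(uuid.uuid4()),
                             Action='lambda:InvokeFunction',
                             Principal='events.amazonaws.com',
                             SourceArn=rule_arn)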

Make new release to match readme

Hi! When I run kappa deploy, it doesn't actually work, because the latest release is from June while the package has since been updated significantly (50 or so commits). It would be great to have a released version with all the great new stuff out :)

KeyError: 'lastEventTimestamp'

Happened while tailing:

tailing logs...
Traceback (most recent call last):
  File "/Projects/lambdas/python/env/bin/kappa", line 155, in <module>
    cli(obj={})
  File "/Projects/lambdas/python/env/lib/python2.7/site-packages/click/core.py", line 664, in __call__
    return self.main(*args, **kwargs)
  File "/Projects/lambdas/python/env/lib/python2.7/site-packages/click/core.py", line 644, in main
    rv = self.invoke(ctx)
  File "/Projects/lambdas/python/env/lib/python2.7/site-packages/click/core.py", line 991, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Projects/lambdas/python/env/lib/python2.7/site-packages/click/core.py", line 837, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Projects/lambdas/python/env/lib/python2.7/site-packages/click/core.py", line 464, in invoke
    return callback(*args, **kwargs)
  File "/Projects/lambdas/python/env/bin/kappa", line 89, in tail
    for e in context.tail()[-10:]:
  File "/Projects/lambdas/python/env/lib/python2.7/site-packages/kappa/context.py", line 150, in tail
    return self.function.tail()
  File "/Projects/lambdas/python/env/lib/python2.7/site-packages/kappa/function.py", line 97, in tail
    return self.log.tail()
  File "/Projects/lambdas/python/env/lib/python2.7/site-packages/kappa/log.py", line 57, in tail
    elif stream['lastEventTimestamp'] > latest_stream['lastEventTimestamp']:
KeyError: 'lastEventTimestamp'

Normally it works fine; I'm not sure why it happened those few times. It cleared up shortly with no other changes. Just thought you should be aware.
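
The field is genuinely optional: describe_log_streams can return a stream that has no events yet, in which case lastEventTimestamp is absent. A defensive sketch of the comparison (not the actual patch in log.py):

def latest_log_stream(streams):
    # Streams without any events have no lastEventTimestamp; treat them as 0.
    if not streams:
        return None
    return max(streams, key=lambda s: s.get('lastEventTimestamp', 0))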

Problem adding s3 event source

When trying to add_event_sources for an S3 source (having already successfully run kappa config.yml create and kappa config.yml invoke; the following is reproduced from the s3 sample in the repo, albeit in eu-west-1), I get the following error:

$ kappa config.yml add_event_sources
adding event sources...
Traceback (most recent call last):
  File "/Users/me/venv_aws/bin/kappa", line 155, in <module>
    cli(obj={})
  File "/Users/me/venv_aws/lib/python2.7/site-packages/click/core.py", line 664, in __call__
    return self.main(*args, **kwargs)
  File "/Users/me/venv_aws/lib/python2.7/site-packages/click/core.py", line 644, in main
    rv = self.invoke(ctx)
  File "/Users/me/venv_aws/lib/python2.7/site-packages/click/core.py", line 991, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/me/venv_aws/lib/python2.7/site-packages/click/core.py", line 837, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/me/venv_aws/lib/python2.7/site-packages/click/core.py", line 464, in invoke
    return callback(*args, **kwargs)
  File "/Users/me/venv_aws/bin/kappa", line 142, in add_event_sources
    context.add_event_sources()
  File "/Users/me/venv_aws/lib/python2.7/site-packages/kappa/context.py", line 119, in add_event_sources
    event_source.add(self.function)
  File "/Users/me/venv_aws/lib/python2.7/site-packages/kappa/event_source.py", line 139, in add
    'InvocationRole': self._context.invoke_role_arn}}
AttributeError: 'Context' object has no attribute 'invoke_role_arn'

I've found that the invoke_role_arn function was removed in commit 2bbf5fa.

Additionally, if I manually add the event source, I get the following error when trying to get the status (this is having upgraded boto3 to 0.0.18 to make the package work at all):

$ kappa config.yml status
Policy
    TestLambdaPolicy201506151108 (arn:aws:iam::505016xxxxxx:policy/kappa/TestLambdaPolicy201506151108)
Role
    TestLambdaRole201506151108 (arn:aws:iam::505016xxxxxx:role/kappa/TestLambdaRole201506151108)
Function
    S3Sample201506151108 (arn:aws:lambda:eu-west-1:505016xxxxxx:function:S3Sample201506151108)
Event Sources
Traceback (most recent call last):
  File "/Users/me/venv_aws/bin/kappa", line 155, in <module>
    cli(obj={})
  File "/Users/me/venv_aws/lib/python2.7/site-packages/click/core.py", line 664, in __call__
    return self.main(*args, **kwargs)
  File "/Users/me/venv_aws/lib/python2.7/site-packages/click/core.py", line 644, in main
    rv = self.invoke(ctx)
  File "/Users/me/venv_aws/lib/python2.7/site-packages/click/core.py", line 991, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/me/venv_aws/lib/python2.7/site-packages/click/core.py", line 837, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/me/venv_aws/lib/python2.7/site-packages/click/core.py", line 464, in invoke
    return callback(*args, **kwargs)
  File "/Users/me/venv_aws/bin/kappa", line 124, in status
    event_source['EventSourceArn'], event_source['State'])
KeyError: 'EventSourceArn'

Question re log group

Hi,
On running the step "Run kappa --config tail to view the function's output in CloudWatch logs", I'm getting an exception:

console excerpt

stimpy ::‹develop*› /home/dacaba/repos/kappa/samples/kinesis
» kappa --debug --config config.yml tail
2015-01-16 10:14:21,763 - kappa.function - DEBUG - tailing function: KinesisSample
2015-01-16 10:14:21,767 - kappa.log - DEBUG - tailing log group: /aws/lambda/KinesisSample
2015-01-16 10:14:21,767 - kappa.log - DEBUG - getting streams for log group: /aws/lambda/KinesisSample
Traceback (most recent call last):
  File "/usr/local/bin/kappa", line 5, in <module>
    pkg_resources.run_script('kappa==0.1.0', 'kappa')
...

stuff elided

...
raise ClientError(parsed_response, operation_name)
botocore.exceptions.ClientError: An error occurred (ResourceNotFoundException) when calling the DescribeLogStreams operation: The specified log group does not exist.

end console excerpt

For the sake of this experiment, I created a Kinesis stream and used its ARN to update the config.yml file. Should I also have created the /aws/lambda/KinesisSample CloudWatch log group and log stream, or is kappa supposed to handle that?

Thank you,
drc

Use aws-keychain for credentials

I don't usually write my credentials out to a credentials file, but store them using aws-keychain. Would it be possible to have an option so that a profile is not mandatory?

Support Python3

Running the unit tests with Python 3 fails due to an MD5 comparison check in context.py.
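
The usual culprit is hashing a str under Python 3, where hashlib requires bytes; a sketch of a version-agnostic checksum helper (the actual comparison in context.py may be structured differently):

import hashlib

def checksum(data):
    # hashlib.md5 only accepts bytes; encode str so the same code runs on
    # both Python 2 and Python 3.
    if isinstance(data, str):
        data = data.encode('utf-8')
    return hashlib.md5(data).hexdigest()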

Log Group Has Not Been Created

Not sure what to make of this one.

$ kappa MyEvent.yaml tail
tailing logs...
    log group /aws/lambda/MyEvent has not been created yet
...done

Shouldn't this have been done during the create step?

I have other Lambda functions for this bucket that work fine. I can also execute this event in the console just fine and I see the log there.

What's going on here?

Following symlinks

I'm sharing some code between several lambda functions. Instead of creating a module and installing it in each _src directory I am finding it easier to symlink to the common code like so:

func1/_src/common -> ../../common
func2/_src/common -> ../../common

In order to make this work for the zipfile creation I need to change os.walk(lambda_dir) to os.walk(lambda_dir, followlinks=True) in function.py.

Would you be open to a PR to make this configurable or is there a better way to share code between lambda functions?

S3 Event Source delete function not working as expected

The S3 event source delete function is not working as expected. This is tangentially related to issue #88. The delete function assumes that only one notification has been set up: it checks the first entry in "CloudFunctionConfiguration" to see if the function ARN matches, and if it does, it removes the whole "CloudFunctionConfiguration" block.

This is incorrect for two reasons:

  • If a different notification has been set up and appears before the expected notification, the expected notification will not be deleted.
  • If a different notification has been set up and appears after the expected notification, the expected notification and the other notification will both be deleted.

EDIT:
It's not that the code checks only the first entry in "CloudFunctionConfiguration"; it's that it uses the deprecated get-bucket-notification call, which only returns the first notification. The result is the same.

S3 Event Source status function not working as expected

The S3 event source status function is not working as expected. This is tangentially related to issues #88 and #89. The code uses the deprecated get-bucket-notification call, which only returns the first notification; this may or may not be the expected one.

Also, the function returns the function ARN as the EventSourceArn. This should be the bucket ARN.
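
The non-deprecated API returns every configured notification, so both the status and delete paths could filter by function ARN instead of trusting the first entry; a boto3 sketch:

import boto3

def lambda_notifications_for(bucket, function_arn):
    # Return only the Lambda notification configs that target this function.
    s3 = boto3.client('s3')
    config = s3.get_bucket_notification_configuration(Bucket=bucket)
    return [c for c in config.get('LambdaFunctionConfigurations', [])
            if c.get('LambdaFunctionArn') == function_arn]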

Error: pkg_resources._vendor.packaging.requirements.InvalidRequirement

Hi,
I installed the latest development branch with

pip install git+https://github.com/garnaat/kappa.git

and then when I run kappa, I get an error:

console> kappa
Traceback (most recent call last):
  File "/Users/rabraham/Documents/dev/python/virtual_envs/tracker_env/bin/kappa", line 5, in <module>
    from pkg_resources import load_entry_point
  File "/Users/rabraham/Documents/dev/python/virtual_envs/tracker_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 2912, in <module>
    @_call_aside
  File "/Users/rabraham/Documents/dev/python/virtual_envs/tracker_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 2898, in _call_aside
    f(*args, **kwargs)
  File "/Users/rabraham/Documents/dev/python/virtual_envs/tracker_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 2925, in _initialize_master_working_set
    working_set = WorkingSet._build_master()
  File "/Users/rabraham/Documents/dev/python/virtual_envs/tracker_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 642, in _build_master
    ws.require(__requires__)
  File "/Users/rabraham/Documents/dev/python/virtual_envs/tracker_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 943, in require
    needed = self.resolve(parse_requirements(requirements))
  File "/Users/rabraham/Documents/dev/python/virtual_envs/tracker_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 838, in resolve
    new_requirements = dist.requires(req.extras)[::-1]
  File "/Users/rabraham/Documents/dev/python/virtual_envs/tracker_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 2462, in requires
    dm = self._dep_map
  File "/Users/rabraham/Documents/dev/python/virtual_envs/tracker_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 2686, in _dep_map
    self.__dep_map = self._compute_dependencies()
  File "/Users/rabraham/Documents/dev/python/virtual_envs/tracker_env/lib/python2.7/site-packages/pkg_resources/__init__.py", line 2696, in _compute_dependencies
    current_req = packaging.requirements.Requirement(req)
  File "/Users/rabraham/Documents/dev/python/virtual_envs/tracker_env/lib/python2.7/site-packages/pkg_resources/_vendor/packaging/requirements.py", line 94, in __init__
    requirement_string[e.loc:e.loc + 8]))
pkg_resources._vendor.packaging.requirements.InvalidRequirement: Invalid requirement, parse error at "'and plat'"

When I install kappa with pip install kappa it works, but I am trying to get the latest version, which reflects the GitHub docs.

Python support

As introduced at re:Invent 2015, AWS Lambda now supports Python as a runtime language.

Exception if 'resources' is missing

If 'statements' is used instead of 'resources', kappa throws a KeyError (see the sketch after the config below).

Stacktrace:

deploying
...deploying policy ses_send_mail_dev
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/2.7/bin//kappa", line 9, in <module>
    load_entry_point('kappa==0.3.1', 'console_scripts', 'kappa')()
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/click/core.py", line 716, in __call__
    return self.main(*args, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/click/core.py", line 696, in main
    rv = self.invoke(ctx)
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/click/core.py", line 1060, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/click/core.py", line 889, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/click/core.py", line 534, in invoke
    return callback(*args, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/click/decorators.py", line 64, in new_func
    return ctx.invoke(f, obj, *args[1:], **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/click/core.py", line 534, in invoke
    return callback(*args, **kwargs)
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/kappa/scripts/cli.py", line 58, in deploy
    ctx.deploy()
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/kappa/context.py", line 222, in deploy
    self.policy.deploy()
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/kappa/policy.py", line 138, in deploy
    document = self.document()
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/kappa/policy.py", line 52, in document
    for resource in self.config['policy']['resources']:
KeyError: 'resources'

Kappa config:

 dev:
    profile: default
    region: eu-west-1
    policy:
        statements:
            - Effect: Allow
              Resource: "*"
              Action:
                - "logs:*"
            - Effect: Allow
              Resource: "*"
              Action:
                - "ses:Send*"

Support S3 code (avoid creating zip)

First off, thanks for creating this tool. I like the simplicity.

On my project I'm using Java, so I already have build tooling in place via mvn package. This produces a single JAR that contains several lambda functions.

Consequently I'd like to be able to do something like this:

#!/bin/bash
set -e

# build JAR and upload to S3
mvn clean package
aws s3 cp target/myproj-with-deps.jar s3://mybucket/myproj.jar

# use kappa to register lambdas
kappa --config lambda/func1.yml deploy
kappa --config lambda/func2.yml deploy

I have a quick-and-dirty version of the above working on my fork. I augmented the lambda section of the config to support an optional code block. If this block is present, the S3 keys are set in the create_function and update_function_code calls. For example:

---
name: hello
lambda:
  # new optional section - if present, no ZIP is created and S3 code is used
  code:
    bucket: mybucket
    key: myproj.jar    
  description: Hello Cats
  handler: com.bitmechanic.foo.LambdaHello::handleRequest
  runtime: java8
  memory_size: 128
  timeout: 10

One consequence of my current implementation is that the config file is not generated in the S3 case, as that happens as part of the ZIP bundling. This is what I'd expect; I'm not relying on kappa to bundle my code for me, so I wouldn't expect configuration manifests to be injected into my JAR.

Would you entertain a PR with this work?

I'm open to feedback on the YAML changes. I also implemented this by factoring all the ZIP stuff out into a separate class in function.py, so that there are S3Code and ZipCode classes. Function.update and Function.create delegate to these classes.
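
For anyone curious, the Lambda API already supports this shape; a boto3 sketch of roughly what the S3Code path would call (parameter values taken from the example config above, helper name hypothetical):

import boto3

def create_function_from_s3(cfg, role_arn, region_name=None):
    client = boto3.client('lambda', region_name=region_name)
    return client.create_function(
        FunctionName=cfg['name'],
        Runtime=cfg['lambda']['runtime'],      # e.g. java8
        Role=role_arn,
        Handler=cfg['lambda']['handler'],
        Description=cfg['lambda'].get('description', ''),
        MemorySize=cfg['lambda'].get('memory_size', 128),
        Timeout=cfg['lambda'].get('timeout', 10),
        # No local ZIP: point Lambda at the JAR already uploaded to S3.
        Code={'S3Bucket': cfg['lambda']['code']['bucket'],
              'S3Key': cfg['lambda']['code']['key']})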

Before I submit a PR I need to clean things up, write docs, etc. But before I got too far I wanted to open up the conversation here and get preliminary feedback on the enhancement.

Thanks!

Include python dependencies

Hi All,

Two feature suggestions...

Pull in python dependencies
Is there any appetite for a command-line option that pulls in Python dependencies during the deploy command?

Separate source and build directories
Executing pip install -r requirements.txt -t /path/to/source/ creates a lot of noise in the source directory. It would be great if we could use a separate build directory when creating the Lambda zip package.

I've got some time to spend on this if people think it could be useful.

Cheers,

Anthony.

Feature suggestions

I've implemented a few changes in my fork. They are, I think, mostly orthogonal to the python-refactor branch changes. Let me know if you'd be interested in a pull request for any of them.

  1. Make config file an option, default to looking for kappa.yml or kappa.yaml in any parent folder.
  2. Allow multiple policies. Automatically add an inline policy for CloudWatch logs to role. Allow policy documents to be specified inline
  3. Default to using 'src/' if lambda.path is not provided in config file.
  4. Use the name of the directory containing the kappa.ya?ml file as the default name for function and role if they are not provided.
  5. Commandline input for invoke

IAM caching: better to query against actual state instead?

Would it make more sense, when a deploy happens, to query the role and policy in AWS and compare their contents against the local info? This would help overwrite changes made directly to those policies and roles, for example if extra permissions were granted for testing.

S3 Event Source should add permission for S3 bucket to invoke function

Currently an S3 event source does not work out of the box as you would expect:

    event_sources:
      - arn: arn:aws:s3:::<bucket>
        events:
          - s3:ObjectCreated:*
        key_filters:
          - type: prefix
            value: <prefix>/
          - type: suffix
            value: <suffix>

The issue is that the S3 bucket does not have permission to invoke the function. The workaround is to add the permission in the "permissions" section under "lambda":

lambda:
  description: <description>
  handler: <function>.lambda_handler
  runtime: python2.7
  memory_size: 128
  timeout: 300
  permissions:
    - action: lambda:invokeFunction
      principal: s3.amazonaws.com
      SourceArn: arn: arn:aws:s3:::<bucket>

This permission should be added automatically by the S3 event source's add function, similar to how the CloudWatch event source's add function adds the relevant permission, e.g. something like:

response = self._lambda.call('add_permission',
                             FunctionName=function.name,
                             StatementId=str(uuid.uuid4()),
                             Action='lambda:InvokeFunction',
                             Principal='s3.amazonaws.com',
                             SourceArn=self._get_bucket_name())

Zip is created uncompressed

I'm happy to write a fix, but I'm not sure which direction you'd like to go:

  1. compress the zipfile by default
  2. compress the zipfile if a flag is passed (e.g. --compress)
  3. compress by default, allow a --no-compress flag
  4. automatically decide based on the size of the zip

Thoughts?
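
Whichever option wins, the change itself is tiny; in the stdlib it is just the compression argument to ZipFile (sketch):

import zipfile

def open_bundle(path, compress=True):
    # ZIP_DEFLATED compresses entries; ZIP_STORED (the current behaviour)
    # writes them uncompressed, which is what produces the large bundles.
    method = zipfile.ZIP_DEFLATED if compress else zipfile.ZIP_STORED
    return zipfile.ZipFile(path, 'w', method)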

Error if you are trying to use an existing role

>> kappa config.yml create
creating...
    Error creating Role
Traceback (most recent call last):
  File "/opt/boxen/homebrew/lib/python2.7/site-packages/kappa/role.py", line 80, in create
    AssumeRolePolicyDocument=AssumeRolePolicyDocument)
  File "/opt/boxen/homebrew/lib/python2.7/site-packages/botocore/client.py", line 197, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/opt/boxen/homebrew/lib/python2.7/site-packages/botocore/client.py", line 252, in _make_api_call
    raise ClientError(parsed_response, operation_name)
ClientError: An error occurred (EntityAlreadyExists) when calling the CreateRole operation: Role with name lambda_exec_role already exists.
...done
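
One possible direction, sketched with plain boto3: treat EntityAlreadyExists as "reuse the existing role" rather than as a failure (kappa's role.py is organised differently, so this is only illustrative):

import boto3
from botocore.exceptions import ClientError

def create_or_get_role(role_name, assume_role_policy_document):
    iam = boto3.client('iam')
    try:
        return iam.create_role(
            RoleName=role_name,
            AssumeRolePolicyDocument=assume_role_policy_document)['Role']
    except ClientError as exc:
        if exc.response['Error']['Code'] != 'EntityAlreadyExists':
            raise
        # The role was created outside kappa; look it up and reuse it.
        return iam.get_role(RoleName=role_name)['Role']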

Please release Kappa 0.6.1

Would be nice if we could get a small version bump after the recent S3-related merge; I'd like to do a new Zappa release today or tomorrow that depends on that commit.

Error turning sample into Python. Python AWS Lambda sample?

I'm trying to adapt the Node.js-based s3 sample to Python. I tried modifying the lambda section of config.yml:

lambda:
  name: S3PythonSample
  zipfile_name: S3PythonSample.zip
  description: Testing S3 Lambda Python handler
  path: examplefolder/
  handler: hello_lambda.lambda_handler
  runtime: python

However, I get this error after running kappa ./config.yml create:

$ kappa ./config.yml create 
creating...
/Library/Python/2.7/site-packages/botocore/vendored/requests/packages/urllib3/connection.py:251: SecurityWarning: Certificate has no `subjectAltName`, falling back to check for a `commonName` for now. This feature is being removed by major browsers and deprecated by RFC 2818. (See https://github.com/shazow/urllib3/issues/497 for details.)
  SecurityWarning
/Library/Python/2.7/site-packages/botocore/vendored/requests/packages/urllib3/connection.py:251: SecurityWarning: Certificate has no `subjectAltName`, falling back to check for a `commonName` for now. This feature is being removed by major browsers and deprecated by RFC 2818. (See https://github.com/shazow/urllib3/issues/497 for details.)
  SecurityWarning
        Unable to upload zip file
Traceback (most recent call last):
  File "/Library/Python/2.7/site-packages/kappa/function.py", line 162, in create
    MemorySize=self.memory_size)
  File "/Library/Python/2.7/site-packages/botocore/client.py", line 258, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/Library/Python/2.7/site-packages/botocore/client.py", line 312, in _make_api_call
    raise ClientError(parsed_response, operation_name)
ClientError: An error occurred (ValidationException) when calling the CreateFunction operation: 1 validation error detected: Value 'python' at 'runtime' failed to satisfy constraint: Member must satisfy enum value set: [nodejs]
        Unable to add permission
Traceback (most recent call last):
  File "/Library/Python/2.7/site-packages/kappa/function.py", line 141, in add_permissions
    response = self._lambda_svc.add_permission(**kwargs)
  File "/Library/Python/2.7/site-packages/botocore/client.py", line 258, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/Library/Python/2.7/site-packages/botocore/client.py", line 312, in _make_api_call
    raise ClientError(parsed_response, operation_name)

SNS Topic subscriptions not working(?)

I've configured some SNS topics as event sources for my Lambda functions, but it doesn't work correctly:

  1. The subscription is created correctly.
  2. The Lambda function is not triggered when I publish a message to the subscribed topic.
  3. The subscription does not appear in the "Triggers" tab in the AWS Console.

To make it work as expected, I need to run the command in item 4 of the AWS Lambda documentation below:

http://docs.aws.amazon.com/lambda/latest/dg/with-sns-create-x-account-permissions.html

This step adds the SNS topic as a trigger (event source) of the Lambda function.
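
That step boils down to a single add_permission call granting SNS invoke rights; a boto3 sketch of what kappa could issue when it wires up an SNS event source:

import uuid
import boto3

def allow_sns_to_invoke(function_name, topic_arn, region_name=None):
    # Grant the SNS topic permission to invoke the Lambda function.
    client = boto3.client('lambda', region_name=region_name)
    client.add_permission(FunctionName=function_name,
                          StatementId=str(uuid.uuid4()),
                          Action='lambda:InvokeFunction',
                          Principal='sns.amazonaws.com',
                          SourceArn=topic_arn)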

Error Running Current Pip Version

Hello,

When I try to run any kappa command after installing via pip (v0.3.0), I get the following error:

$ kappa config.yml status
Traceback (most recent call last):
  File "/Users/me/venv_aws/bin/kappa", line 155, in <module>
    cli(obj={})
  File "/Users/me/venv_aws/lib/python2.7/site-packages/click/core.py", line 664, in __call__
    return self.main(*args, **kwargs)
  File "/Users/me/venv_aws/lib/python2.7/site-packages/click/core.py", line 644, in main
    rv = self.invoke(ctx)
  File "/Users/me/venv_aws/lib/python2.7/site-packages/click/core.py", line 991, in invoke
    return _process_result(sub_ctx.command.invoke(sub_ctx))
  File "/Users/me/venv_aws/lib/python2.7/site-packages/click/core.py", line 837, in invoke
    return ctx.invoke(self.callback, **ctx.params)
  File "/Users/me/venv_aws/lib/python2.7/site-packages/click/core.py", line 464, in invoke
    return callback(*args, **kwargs)
  File "/Users/me/venv_aws/bin/kappa", line 97, in status
    context = Context(ctx.obj['config'], ctx.obj['debug'])
  File "/Users/me/venv_aws/lib/python2.7/site-packages/kappa/context.py", line 39, in __init__
    self, self.config['iam']['policy'])
  File "/Users/me/venv_aws/lib/python2.7/site-packages/kappa/policy.py", line 26, in __init__
    aws = kappa.aws.get_aws(context)
  File "/Users/me/venv_aws/lib/python2.7/site-packages/kappa/aws.py", line 37, in get_aws
    __Singleton_AWS = __AWS(context.profile, context.region)
  File "/Users/me/venv_aws/lib/python2.7/site-packages/kappa/aws.py", line 22, in __init__
    region_name=region_name, profile_name=profile_name)
  File "/Users/me/venv_aws/lib/python2.7/site-packages/boto3/session.py", line 76, in __init__
    self._setup_loader()
  File "/Users/me/venv_aws/lib/python2.7/site-packages/boto3/session.py", line 96, in _setup_loader
    [self._loader.data_path,
AttributeError: 'Loader' object has no attribute 'data_path'

If I upgrade boto3 with pip to 0.0.18 or higher (setup.py installs 0.0.16; I notice 0.0.21 is out now), most features seem to work.

Existing role configuration

Does kappa support using existing IAM roles? (I'm assuming it does, based on the kinesis sample.)
I see the kinesis sample has the "iam" block, but it doesn't seem to include the "environments" block.

I just get an error saying "Invalid environment dev specified".

Updating a CloudWatch event source causes the trigger to show multiple times in the AWS console

When updating a CloudWatch event source, the add function is called:

def update(self, function):
    self.add(function)

The add function does three things:

  • It calls events put-rule, which seems fine: it creates the rule on the first call and updates the same rule on later calls.
  • It calls events put-targets, which also seems fine: it creates the target on the first call and updates the same target on later calls.
  • It calls lambda add-permission, and this seems to be the issue: each call uses a unique "statement-id", so multiple statements get added. This can be seen by calling get-policy:

aws lambda get-policy --function-name <function_name>

{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Condition": {
        "ArnLike": {
          "AWS:SourceArn": "arn:aws:events:us-east-1:<account>:rule/<rule_name>"
        }
      },
      "Action": "lambda:InvokeFunction",
      "Resource": "arn:aws:lambda:us-east-1:<account>:function:<function_name>",
      "Effect": "Allow",
      "Principal": {
        "Service": "events.amazonaws.com"
      },
      "Sid": "df9ee7d2-2fd9-4058-9637-97b4c35a7d2a"
    },
    {
      "Condition": {
        "ArnLike": {
          "AWS:SourceArn": "arn:aws:events:us-east-1:<account>:rule/rule_name"
        }
      },
      "Action": "lambda:InvokeFunction",
      "Resource": "arn:aws:lambda:us-east-1:<account>:function:<function_name>",
      "Effect": "Allow",
      "Principal": {
        "Service": "events.amazonaws.com"
      },
      "Sid": "e3867f15-1b94-4d4c-a3f1-205865b9a5a7"
    }
  ],
  "Id": "<id>"
}

This results in the rule showing up multiple times under "Triggers" for the Lambda function in the AWS console.

Perhaps the old statement id should be reused when calling update? Or perhaps add_permission should not be called on update at all? Or is this an issue that should be raised with AWS?
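
Using a statement id derived from the rule would make the call idempotent: a repeat then fails with ResourceConflictException, which can simply be ignored (a sketch, not kappa's current behaviour):

import boto3

def ensure_invoke_permission(function_name, rule_name, rule_arn,
                             region_name=None):
    client = boto3.client('lambda', region_name=region_name)
    try:
        client.add_permission(FunctionName=function_name,
                              # Deterministic id, so the same rule never adds
                              # a second statement on later updates.
                              StatementId='events-' + rule_name,
                              Action='lambda:InvokeFunction',
                              Principal='events.amazonaws.com',
                              SourceArn=rule_arn)
    except client.exceptions.ResourceConflictException:
        pass  # permission already present from an earlier deploy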

Cannot create Event Sources

Hi,

I love kappa and have just started using it. It's very well written.

But for some reason I cannot get "Event Sources" to connect. Here is the error:

$ kappa ./config.yml add_event_sources 
adding event sources...
        Unable to add S3 event source
Traceback (most recent call last):
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/kappa/event_source.py", line 142, in add
    NotificationConfiguration=notification_spec)
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/botocore/client.py", line 258, in _api_call
    return self._make_api_call(operation_name, kwargs)
  File "/Library/Frameworks/Python.framework/Versions/2.7/lib/python2.7/site-packages/botocore/client.py", line 312, in _make_api_call
    raise ClientError(parsed_response, operation_name)
ClientError: An error occurred (InvalidArgument) when calling the PutBucketNotification operation: Unable to validate the following destination configurations
...done

Here is my config file:


---
# Change the profile and region to suit your application
region: us-east-1
iam:
  # In this case, we are using an existing managed policy so we just
  # need to put the name of that policy here.
  policy:
    name: AWSLambdaExecute
  # The name of the IAM role used for executing the Lambda function.
  # The policy listed above will be attached to this role once it is created.
  role:
    name: tennis-lambda-s3-execution-role
lambda:
  name: cronitor_handler
  zipfile_name: cronitor_handler.zip
  description: Monitor s3 updates for Cronitor
  path:  cronitor_handler.py
  handler: lambda_function.cronitor_handler
  runtime: python2.7
  memory_size: 128
  timeout: 10
  mode: event
  test_data: event.json
  event_sources:
    - arn: arn:aws:s3:::test-ngd-db-backups
      events:
        - s3:ObjectCreated:*

Any ideas what I may be doing wrong?

Thanks
-T

Minor Documentation Error ('add-event-sources')

Small documentation error: in README.md, 'add_event_sources' is referred to as both 'add-event-sources' and 'add_event_source', neither of which is correct. A simple find-and-replace should fix it!

Tx
R
