Here is my configuration (I've changed all bucket names and project IDs for obvious reasons):
...
2018/03/22 11:01:25 Setting 'log.format' from environment.
2018/03/22 11:01:25 Setting 'xpack.monitoring.elasticsearch.url' from environment.
Sending Logstash's logs to /usr/share/logstash/logs which is now configured via log4j2.properties
[2018-03-22T11:02:02,331][WARN ][logstash.runner ] --config.debug was specified, but log.level was not set to 'debug'! No config info will be logged.
[2018-03-22T11:02:02,373][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"netflow", :directory=>"/usr/share/logstash/modules/netflow/configuration"}
[2018-03-22T11:02:02,397][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"fb_apache", :directory=>"/usr/share/logstash/modules/fb_apache/configuration"}
[2018-03-22T11:02:03,460][INFO ][logstash.modules.scaffold] Initializing module {:module_name=>"arcsight", :directory=>"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/x-pack-6.1.0-java/modules/arcsight/configuration"}
[2018-03-22T11:02:04,125][WARN ][logstash.config.source.multilocal] Ignoring the 'pipelines.yml' file because modules or command line options are specified
[2018-03-22T11:02:05,238][INFO ][logstash.runner ] Starting Logstash {"logstash.version"=>"6.1.0"}
[2018-03-22T11:02:05,845][INFO ][logstash.agent ] Successfully started Logstash API endpoint {:port=>9600}
[2018-03-22T11:02:08,562][WARN ][logstash.outputs.elasticsearch] You are using a deprecated config setting "document_type" set in elasticsearch. Deprecated settings will continue to work, but are scheduled for removal from logstash in the future. Document types are being deprecated in Elasticsearch 6.0, and removed entirely in 7.0. You should avoid this feature If you have any questions about this, please visit the #logstash channel on freenode irc. {:name=>"document_type", :plugin=><LogStash::Outputs::ElasticSearch hosts=>["http://elasticsearch:9200"], bulk_path=>"/_xpack/monitoring/_bulk?system_id=logstash&system_api_version=2&interval=1s", manage_template=>"false", document_type=>"%{[@metadata][document_type]}", sniffing=>"false", id=>"482e2490d75257e21cd2b7d49268d74674b3b7c32f0cdb0eef4694242d57f5fb">}
[2018-03-22T11:02:09,380][INFO ][logstash.outputs.elasticsearch] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[2018-03-22T11:02:09,411][INFO ][logstash.outputs.elasticsearch] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elasticsearch:9200/, :path=>"/"}
[2018-03-22T11:02:09,731][WARN ][logstash.outputs.elasticsearch] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
[2018-03-22T11:02:09,810][INFO ][logstash.outputs.elasticsearch] ES Output version determined {:es_version=>nil}
[2018-03-22T11:02:09,874][INFO ][logstash.outputs.elasticsearch] New Elasticsearch output {:class=>"LogStash::Outputs::ElasticSearch", :hosts=>["http://elasticsearch:9200"]}
[2018-03-22T11:02:09,914][INFO ][logstash.pipeline ] Starting pipeline {:pipeline_id=>".monitoring-logstash", "pipeline.workers"=>1, "pipeline.batch.size"=>2, "pipeline.batch.delay"=>5, "pipeline.max_inflight"=>2, :thread=>"#<Thread:0xca43c07 run>"}
[2018-03-22T11:02:10,053][INFO ][logstash.licensechecker.licensereader] Elasticsearch pool URLs updated {:changes=>{:removed=>[], :added=>[http://elasticsearch:9200/]}}
[2018-03-22T11:02:10,054][INFO ][logstash.licensechecker.licensereader] Running health check to see if an Elasticsearch connection is working {:healthcheck_url=>http://elasticsearch:9200/, :path=>"/"}
[2018-03-22T11:02:10,074][WARN ][logstash.licensechecker.licensereader] Restored connection to ES instance {:url=>"http://elasticsearch:9200/"}
[2018-03-22T11:02:10,112][INFO ][logstash.licensechecker.licensereader] ES Output version determined {:es_version=>nil}
[2018-03-22T11:02:10,292][INFO ][logstash.pipeline ] Pipeline started {"pipeline.id"=>".monitoring-logstash"}
[2018-03-22T11:02:16,086][ERROR][logstash.pipeline ] Error registering plugin {:pipeline_id=>"main", :plugin=>"#<LogStash::OutputDelegator:0x270a138 @namespaced_metric=#<LogStash::Instrument::NamespacedMetric:0xd76dfbf @metric=#<LogStash::Instrument::Metric:0x7be2e6f @collector=#<LogStash::Instrument::Collector:0x7827e615 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x7f59f148 @store=#<Concurrent::Map:0x00000000000fb4 entries=3 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x75828e9e>, @fast_lookup=#<Concurrent::Map:0x00000000000fb8 entries=63 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs, :\"09d2e504e8de6887836a4879cca23b984f165252a32f136bcf5d24ff1cc04bb1\"]>, @metric=#<LogStash::Instrument::NamespacedMetric:0x4b9708da @metric=#<LogStash::Instrument::Metric:0x7be2e6f @collector=#<LogStash::Instrument::Collector:0x7827e615 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x7f59f148 @store=#<Concurrent::Map:0x00000000000fb4 entries=3 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x75828e9e>, @fast_lookup=#<Concurrent::Map:0x00000000000fb8 entries=63 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs]>, @logger=#<LogStash::Logging::Logger:0xe7a6303 @logger=#<Java::OrgApacheLoggingLog4jCore::Logger:0x44474f51>>, @out_counter=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: out value:0, @strategy=#<LogStash::OutputDelegatorStrategies::Single:0x373893af @mutex=#<Mutex:0x6a078c30>, @output=<LogStash::Outputs::GoogleCloudStorage bucket=>\"my-bucket\", flush_interval_secs=>5, gzip=>true, key_path=>\"/shh/key.p12\", log_file_prefix=>\"wt2\", max_file_size_kbytes=>102400, output_format=>\"plain\", service_account=>\"[email protected]\", temp_directory=>\"/usr/share/logstash/data/tmp\", id=>\"09d2e504e8de6887836a4879cca23b984f165252a32f136bcf5d24ff1cc04bb1\", enable_metric=>true, codec=><LogStash::Codecs::Plain 
id=>\"plain_8dd97555-2f0c-4eb1-ab7d-e3d8650557d1\", enable_metric=>true, charset=>\"UTF-8\">, workers=>1, key_password=>\"notasecret\", date_pattern=>\"%Y-%m-%dT%H:00\", uploader_interval_secs=>60>>, @in_counter=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: in value:0, @id=\"09d2e504e8de6887836a4879cca23b984f165252a32f136bcf5d24ff1cc04bb1\", @time_metric=org.jruby.proxy.org.logstash.instrument.metrics.counter.LongCounter$Proxy2 - name: duration_in_millis value:0, @metric_events=#<LogStash::Instrument::NamespacedMetric:0x14c47be6 @metric=#<LogStash::Instrument::Metric:0x7be2e6f @collector=#<LogStash::Instrument::Collector:0x7827e615 @agent=nil, @metric_store=#<LogStash::Instrument::MetricStore:0x7f59f148 @store=#<Concurrent::Map:0x00000000000fb4 entries=3 default_proc=nil>, @structured_lookup_mutex=#<Mutex:0x75828e9e>, @fast_lookup=#<Concurrent::Map:0x00000000000fb8 entries=63 default_proc=nil>>>>, @namespace_name=[:stats, :pipelines, :main, :plugins, :outputs, :\"09d2e504e8de6887836a4879cca23b984f165252a32f136bcf5d24ff1cc04bb1\", :events]>, @output_class=LogStash::Outputs::GoogleCloudStorage>", :error=>"certificate verify failed", :thread=>"#<Thread:0x7a1c234e run>"}
[2018-03-22T11:02:16,116][ERROR][logstash.pipeline ] Pipeline aborted due to error {:pipeline_id=>"main", :exception=>#<Faraday::SSLError>, :backtrace=>["org/jruby/ext/openssl/SSLSocket.java:228:in `connect_nonblock'", "/usr/share/logstash/vendor/jruby/lib/ruby/stdlib/net/http.rb:938:in `connect'", "/usr/share/logstash/vendor/jruby/lib/ruby/stdlib/net/http.rb:868:in `do_start'", "/usr/share/logstash/vendor/jruby/lib/ruby/stdlib/net/http.rb:857:in `start'", "/usr/share/logstash/vendor/jruby/lib/ruby/stdlib/net/http.rb:1409:in `request'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:82:in `perform_request'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:40:in `block in call'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:87:in `with_net_http_connection'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/faraday-0.9.2/lib/faraday/adapter/net_http.rb:32:in `call'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/faraday-0.9.2/lib/faraday/request/url_encoded.rb:15:in `call'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/faraday-0.9.2/lib/faraday/rack_builder.rb:139:in `build_response'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/faraday-0.9.2/lib/faraday/connection.rb:377:in `run_request'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/faraday-0.9.2/lib/faraday/connection.rb:177:in `post'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/signet-0.8.1/lib/signet/oauth_2/client.rb:967:in `fetch_access_token'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/signet-0.8.1/lib/signet/oauth_2/client.rb:1005:in `fetch_access_token!'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/google-api-client-0.8.7/lib/google/api_client/auth/jwt_asserter.rb:105:in `authorize'", 
"/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-google_cloud_storage-3.0.4/lib/logstash/outputs/google_cloud_storage.rb:374:in `initialize_google_client'", "/usr/share/logstash/vendor/bundle/jruby/2.3.0/gems/logstash-output-google_cloud_storage-3.0.4/lib/logstash/outputs/google_cloud_storage.rb:132:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator_strategies/single.rb:10:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/output_delegator.rb:43:in `register'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:343:in `register_plugin'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:354:in `block in register_plugins'", "org/jruby/RubyArray.java:1734:in `each'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:354:in `register_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:743:in `maybe_setup_out_plugins'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:364:in `start_workers'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:288:in `run'", "/usr/share/logstash/logstash-core/lib/logstash/pipeline.rb:248:in `block in start'"], :thread=>"#<Thread:0x7a1c234e run>"}
[2018-03-22T11:02:16,146][ERROR][logstash.agent ] Failed to execute action {:id=>:main, :action_type=>LogStash::ConvergeResult::FailedAction, :message=>"Could not execute action: LogStash::PipelineAction::Create/pipeline_id:main, action_result: false", :backtrace=>nil}
[2018-03-22T11:02:16,241][INFO ][logstash.inputs.metrics ] Monitoring License OK
[2018-03-22T11:02:20,620][ERROR][logstash.inputs.metrics ] Failed to create monitoring event {:message=>"undefined method `system?' for nil:NilClass", :error=>"NoMethodError"}
As I said, it doesn't seem to be a p12 key issue (that line is never reached) but some other odd behaviour. Is this plugin still supported with Google Cloud's current storage API, or with Logstash 5.x or 6.x for that matter? Any help would be much appreciated, including alternatives for shipping Filebeat logs to GCS.
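For context on where I think the failure sits: the `Faraday::SSLError` is raised inside `SSLSocket#connect_nonblock` while `signet` fetches the OAuth access token, i.e. during TLS verification of Google's endpoint, before the p12 key is ever exercised. One sanity check I tried (this is just a sketch, assuming the cause is a missing or empty CA bundle inside the Logstash container) is to ask the bundled JRuby where its OpenSSL expects trusted root certificates:

```ruby
require 'openssl'

# Run with the JRuby that Logstash ships, so the answer reflects the same
# runtime the plugin uses, e.g.:
#   /usr/share/logstash/vendor/jruby/bin/jruby check_ca.rb
# (check_ca.rb is my name for this script, not anything from the plugin.)
cert_file = OpenSSL::X509::DEFAULT_CERT_FILE
cert_dir  = OpenSSL::X509::DEFAULT_CERT_DIR

puts "cert file: #{cert_file} (exists: #{File.exist?(cert_file)})"
puts "cert dir:  #{cert_dir} (exists: #{Dir.exist?(cert_dir)})"
```

If neither path exists in the container, that would explain "certificate verify failed" regardless of the key. OpenSSL also honours the `SSL_CERT_FILE` / `SSL_CERT_DIR` environment variables, so pointing those at a valid CA bundle might be a workaround, though I haven't confirmed that fixes this particular plugin.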