baseplate-cloudflare-worker

Local Development

One-time setup

  1. Ask for an invite to the Baseplate Cloudflare worker. Create a Cloudflare account and accept the invite.
  2. Install the Wrangler CLI globally: npm i @cloudflare/wrangler -g
  3. Run wrangler login
  4. Clone the GitHub repo
  5. Set up local env vars (see Local env vars below)

Each time

  1. pnpm install (only necessary after a git pull that changed package.json or pnpm-lock.yaml)
  2. pnpm start

Now you can go to http://localhost:8787/walmart/systemjs.importmap and see an import map in the browser.

Local env vars

Copy the .dev.vars.example file to .dev.vars, then fill in the environment variable values. Some of the values can be retrieved by running terraform output -json and looking for the following values:

  • AWS_REGION
  • dev_cf_AWS_ACCESS_KEY_ID
  • dev_cf_AWS_SECRET_ACCESS_KEY
  • dev_cf_TIMESTREAM_DATABASE
  • dev_cf_TIMESTREAM_TABLE

Database / KV Storage

All local development is done against the same Cloudflare KV Storage, which acts as our database. The KV Storage is shared between all developers, which means that we can accidentally break things for each other by modifying it. To avoid issues, each developer should create their own customer/organization and develop against that, so that if they break things it's only for themselves rather than for everyone.

To create an organization, or to sync an existing organization to have correct data, run the following command (replace walmart with the name of your organization):

# On Windows, run this in PowerShell, or swap the forward slashes for backslashes if using Command Prompt
bash ./scripts/sync-org.sh walmart

baseplate-cloudflare-worker's Issues

Support for auto-environment redirect

Let the user put <script type="systemjs-importmap" src="https://cdn.baseplate.cloud/orgKey/systemjs.importmap"> in their static HTML file, but have the worker automatically redirect to the test CDN in test environments, based on the Origin HTTP header.
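One way the redirect decision could look, as a sketch only: a pure function that inspects the Origin header and returns the test-CDN URL when the request comes from a test environment. The host names and the "test origin" heuristic here are assumptions for illustration, not the real Baseplate implementation.

```typescript
// Illustrative sketch: decide whether an import-map request should
// redirect to the test CDN, based on the request's Origin header.
const TEST_CDN = "https://dev-cdn.baseplate.cloud"; // assumed test CDN host

export function importMapRedirectTarget(
  origin: string | null,
  orgKey: string
): string | null {
  // No Origin header (e.g. a direct navigation): serve the prod map as-is.
  if (origin === null) return null;

  // Assumed convention: test environments run on hosts whose name
  // contains "test" or "dev" (purely illustrative).
  const host = new URL(origin).hostname;
  const isTestEnv = host.includes("test") || host.includes("dev");

  return isTestEnv ? `${TEST_CDN}/${orgKey}/systemjs.importmap` : null;
}
```

In the worker, a non-null return value would become a 302 redirect; a null return value means the import map is served directly.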

Rewrite baseplate.cloud urls in import map for custom domains

Example Import Map: https://joel-baseplate-dev-prod.single-spa-playground.org/dev/systemjs.importmap

What it is currently:

{
  "imports": {
    "@joel/navbar": "https://dev-cdn.baseplate.cloud/joel/dev/apps/navbar/main.js",
    "@joel/navbar/": "https://dev-cdn.baseplate.cloud/joel/dev/apps/navbar/"
  },
  "scopes": {
  }
}

What it should be:

{
  "imports": {
    "@joel/navbar": "https://joel-baseplate-dev-prod.single-spa-playground.org/dev/apps/navbar/main.js",
    "@joel/navbar/": "https://joel-baseplate-dev-prod.single-spa-playground.org/dev/apps/navbar/"
  },
  "scopes": {
  }
}
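The transformation above amounts to swapping the baseplate.cloud CDN origin for the org's custom domain in every URL of the map. A minimal sketch, assuming a simple ImportMap shape; the function name and parameters are hypothetical:

```typescript
// Sketch: rewrite baseplate.cloud CDN URLs to a custom domain.
interface ImportMap {
  imports: Record<string, string>;
  scopes: Record<string, Record<string, string>>;
}

export function rewriteForCustomDomain(
  importMap: ImportMap,
  cdnOrigin: string,   // e.g. "https://dev-cdn.baseplate.cloud/joel"
  customOrigin: string // e.g. "https://joel-baseplate-dev-prod.single-spa-playground.org"
): ImportMap {
  // Replace the CDN prefix with the custom domain; leave other URLs alone.
  const rewrite = (url: string) =>
    url.startsWith(cdnOrigin) ? customOrigin + url.slice(cdnOrigin.length) : url;

  const rewriteRecord = (record: Record<string, string>) =>
    Object.fromEntries(
      Object.entries(record).map(([name, url]) => [name, rewrite(url)])
    );

  return {
    imports: rewriteRecord(importMap.imports),
    scopes: Object.fromEntries(
      Object.entries(importMap.scopes).map(([scope, record]) => [
        rewrite(scope),
        rewriteRecord(record),
      ])
    ),
  };
}
```

Scope keys are rewritten too, since scopes in an import map are themselves URL prefixes.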

Error when S3 bucket is incorrect

{
  "outcome": "ok",
  "scriptName": "cdn-tf-worker-dev",
  "diagnosticsChannelEvents": [],
  "exceptions": [],
  "logs": [
    {
      "message": [
        "NoSuchBucket: The specified bucket does not exist"
      ],
      "level": "error",
      "timestamp": 1708464741916
    },
    {
      "message": [
        "The specified bucket does not exist"
      ],
      "level": "error",
      "timestamp": 1708464741916
    },
    {
      "message": [
        "NoSuchBucket: The specified bucket does not exist\n    at S3ServiceException2.ServiceException3 [as constructor] (main.js:9542:24)\n    at new S3ServiceException2 (main.js:9649:24)\n    at main.js:11694:22\n    at step (main.js:255:25)\n    at Object.next (main.js:202:20)\n    at fulfilled (main.js:173:30)"
      ],
      "level": "error",
      "timestamp": 1708464741916
    }
  ],
  "eventTimestamp": 1708464741718,
  "event": {
    "request": {
      "url": "https://dev-cdn.baseplate.cloud/convex/dev/apps/navbar/convex-navbar.js",
      "method": "GET",
      "headers": {
        "accept": "text/html,application/xhtml+xml,application/xml;q=0.9,image/avif,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.7",
        "accept-encoding": "gzip",
        "accept-language": "en-US,en;q=0.9",
        "cache-control": "max-age=0",
        "cf-connecting-ip": "65.121.68.194",
        "cf-ipcountry": "US",
        "cf-ray": "8589dc9bbe926c3c",
        "cf-visitor": "{\"scheme\":\"https\"}",
        "connection": "Keep-Alive",
        "cookie": "REDACTED",
        "host": "dev-cdn.baseplate.cloud",
        "priority": "u=0, i",
        "sec-ch-ua": "\"Not A(Brand\";v=\"99\", \"Google Chrome\";v=\"121\", \"Chromium\";v=\"121\"",
        "sec-ch-ua-mobile": "?0",
        "sec-ch-ua-platform": "\"macOS\"",
        "sec-fetch-dest": "document",
        "sec-fetch-mode": "navigate",
        "sec-fetch-site": "none",
        "sec-fetch-user": "?1",
        "upgrade-insecure-requests": "1",
        "user-agent": "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_15_7) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/121.0.0.0 Safari/537.36",
        "x-forwarded-proto": "https",
        "x-real-ip": "65.121.68.194"
      },
      "cf": {
        "longitude": "-111.85650",
        "latitude": "40.73110",
        "tlsCipher": "AEAD-AES128-GCM-SHA256",
        "continent": "NA",
        "asn": 209,
        "clientAcceptEncoding": "gzip, deflate, br",
        "country": "US",
        "tlsClientRandom": "uo6tYqcUZwkUbwjF5Zpi64U4IB3nYxWuw2tyBHHb4aI=",
        "tlsClientAuth": {
          "certIssuerDNLegacy": "",
          "certIssuerSKI": "",
          "certSubjectDNRFC2253": "",
          "certSubjectDNLegacy": "",
          "certFingerprintSHA256": "",
          "certNotBefore": "",
          "certSKI": "",
          "certSerial": "",
          "certIssuerDN": "",
          "certVerified": "NONE",
          "certNotAfter": "",
          "certSubjectDN": "",
          "certPresented": "0",
          "certRevoked": "0",
          "certIssuerSerial": "",
          "certIssuerDNRFC2253": "",
          "certFingerprintSHA1": ""
        },
        "verifiedBotCategory": "",
        "tlsExportedAuthenticator": {
          "clientFinished": "aa0ce921575dcbfbd819d220067de3dfde72ff7128576d70f02ec1f8074a5a7d",
          "clientHandshake": "79aa7a662490303273f996120db48092b9905c3e6d098593990f48ca5a6b44c3",
          "serverHandshake": "bcd17621800e8aaaf1e2e73e7a9210745fe1dd9370d6d6b6dbf4786110854723",
          "serverFinished": "38ae37a57f6bf3ae6d2110183d3329ec30fe2c581f0202a915fd0b60dfb98545"
        },
        "tlsVersion": "TLSv1.3",
        "city": "Salt Lake City",
        "timezone": "America/Denver",
        "colo": "DFW",
        "edgeRequestKeepAliveStatus": 1,
        "postalCode": "84105",
        "tlsClientHelloLength": "731",
        "region": "Utah",
        "httpProtocol": "HTTP/3",
        "regionCode": "UT",
        "asOrganization": "CenturyLink",
        "metroCode": "770",
        "requestPriority": ""
      }
    },
    "response": {
      "status": 500
    }
  },
  "id": 0
}

Add Foundry-Version HTTP header to all responses

We should add a Foundry-Version HTTP response header for all of our responses. The foundry version is the version in the package.json.

To do this, we need to add the following:

  1. Install Changesets: pnpm add --dev @changesets/cli @changesets/changelog-github && pnpm exec changesets init. Change the changesets config.json to use the changelog-github plugin and to use the main branch rather than master (use the utils repo as an example). In the GitHub workflow .github/workflows/build.yaml, add pnpm exec changeset status at the end, inside of a - run: block (again, use the utils repo as an example). Then ask me to install the changesets bot for this repo. This can be its own separate PR before proceeding to step 2.
  2. Verify that you can get the package.json version by using import packageJson from '../package.json'; and then checking whether packageJson.version is what it needs to be. Use console.log to verify this.
  3. Create a foundryVersion.ts file. Export a function foundryVersionHeaders() which returns an object {"Foundry-Version": packageJson.version}. Then in handleImportMap and handleApp, add the foundryVersionHeaders in the same way that the CORS headers are added. You can copy the pattern for the CORS headers pretty much exactly. Verify in the browser network tab that Foundry-Version comes through as an HTTP response header.
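Step 3 can be sketched as follows. In the real worker the version would come from import packageJson from '../package.json'; a literal object stands in here so the sketch is self-contained, and the merge helper is an assumption modeled on the CORS-headers pattern described above:

```typescript
// Sketch of the proposed foundryVersion.ts.
// Stand-in for `import packageJson from "../package.json"`:
const packageJson = { version: "0.1.0" };

export function foundryVersionHeaders(): Record<string, string> {
  return { "Foundry-Version": packageJson.version };
}

// Illustrative: merge into the response headers the same way the CORS
// headers are merged in handleImportMap / handleApp.
export function withFoundryVersion(
  headers: Record<string, string>
): Record<string, string> {
  return { ...headers, ...foundryVersionHeaders() };
}
```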

Add ability to fetch from S3 rather than public internet

Currently, our handleApps code always calls fetch() to retrieve a file from a proxy origin server.

https://github.com/JustUtahCoders/single-spa-foundry-worker/blob/5b6cd66d43ecf4fc8889227f60451c6282a20c62/src/handleApps.ts#L46

However, that requires the proxy server to be public, with all of its files publicly accessible. Our customers might not like that in every case. Customers who pay us to store their static files in our AWS account might prefer that their S3 bucket not be public. In those situations, we should use @aws-sdk/client-s3 to retrieve the file rather than fetch().

The way to determine whether to use fetch() or the AWS SDK is by looking at the customHost orgSetting. If the customHost starts with s3:// (rather than http:// or https://), then we know to retrieve it from S3. See this documentation, which discusses s3:// urls.

You should update the sync-org.sh script to use the following customHost in the orgSettings, and then update your orgs to use it: s3://public-web-files-juc/. There is a file there, react-mf-navbar.js, that, when this task is completed, should be available at http://127.0.0.1:8787/walmart/apps/react-mf-navbar.js

To use the aws sdk, you'll need access to an s3 bucket. I am sending you those credentials on Slack. You'll need to make those credentials available in the cloudflare worker as a secret. Do not put the env variable values directly into the wrangler.toml as we don't want the credentials to be leaked (credentials should never be committed to git). Set the AWS_ACCESS_KEY_ID, AWS_SECRET_ACCESS_KEY, and AWS_DEFAULT_REGION secrets and then use those to manually authenticate with AWS sdk.

Once you have the credentials set up, see this documentation and go to "Get an object from a bucket". Notice that you need a bucket name and an object key in order to do that. The way s3 urls are structured is the following s3://bucket-name/keypart1/keypart2/keypart3. The full object key is keypart1/keypart2/keypart3. Some code like the following is on the right path to do that.

const url = new URL(customHost);

if (url.protocol === 's3:') {
  // For s3://bucket-name/keypart1/keypart2, the URL host is the bucket
  // name, and the pathname (minus its leading slash) is the object key.
  const Bucket = url.hostname;
  const Key = url.pathname.slice(1);
  const bucketParams = { Bucket, Key };
}

Note that the AWS SDK will not give you a Response object; you'll need to convert what it gives you into a Response object, since that's what Cloudflare requires. Note that S3 objects do have HTTP headers, which will need to be forwarded on to the browser. Also note that the AWS SDK will give you a ReadableStream for the response data, which is great because you can call new Response(someReadableStream) and it will know how to handle that. See the Response constructor docs for details. Do not convert the readable stream to a string as the AWS SDK example code does, since that would be worse for performance.
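Putting the pieces together, a sketch of the bucket/key derivation plus the Response conversion. The parseS3Location function name is hypothetical, and the SDK portion is shown only as comments since it needs real AWS credentials to run:

```typescript
// Sketch: derive the S3 Bucket and Key from a customHost such as
// "s3://public-web-files-juc/" plus the requested file path.
export function parseS3Location(customHost: string, filePath: string) {
  const url = new URL(customHost);
  if (url.protocol !== "s3:") return null;

  // Bucket is the URL host; the key joins any path on the customHost
  // with the requested file path, without leading/trailing slashes.
  const Bucket = url.hostname;
  const basePath = url.pathname.replace(/^\/|\/$/g, "");
  const Key = [basePath, filePath.replace(/^\//, "")]
    .filter(Boolean)
    .join("/");
  return { Bucket, Key };
}

// In the worker (not runnable here without credentials):
//
//   const s3 = new S3Client({ region, credentials });
//   const obj = await s3.send(new GetObjectCommand({ Bucket, Key }));
//   // obj.Body is a ReadableStream; the Response constructor accepts it
//   // directly, so the file streams to the browser without buffering.
//   return new Response(obj.Body as ReadableStream, {
//     headers: { "content-type": obj.ContentType ?? "application/octet-stream" },
//   });
```

The streaming Response keeps memory usage flat regardless of file size, which is why the text above warns against converting the stream to a string first.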
