This repository was archived by the owner on Feb 1, 2022. It is now read-only.

Commit 8eb1eab

feat: handle private buckets (#43)

* chore: add new bucket to example site
* feat: use AWS pre-signed URL to get objects from private buckets
* test: add e2e test to check that the example site has 1502 images
* chore: remove region plugin option. The option is not required anymore when using pre-signed URLs
* revert: "chore: remove region plugin option". This reverts commit d870d3c; the AWS SDK apparently still expects a region
* chore(scripts): add local start script with additional env cleanup
* docs: add READMEs to the starter and testing directories
* docs(tests): add details to the testing README
* fix: fix testing folder structure. Cypress sees any file in the integrationFolder as a spec; the spec had to be nested one level deeper to allow for the README
* docs: update README
* feat: get signed URL only for image files in onCreateNode

BREAKING CHANGE: the Url key doesn't exist on s3Object anymore. There is no public URL for private buckets, and file nodes are now sourced with pre-signed URLs that shouldn't be added to the schema.

1 parent a4f06b4 · commit 8eb1eab

File tree

9 files changed (+204, −64 lines)


README.md

Lines changed: 55 additions & 39 deletions

````diff
@@ -2,7 +2,11 @@
 
 A Gatsby plugin to source objects and images from AWS S3.
 
-## Install
+## Getting started
+
+### Gatsby setup
+
+Install the plugin:
 
 ```bash
 # with npm
@@ -11,18 +15,13 @@ npm install @robinmetral/gatsby-source-s3
 yarn add @robinmetral/gatsby-source-s3
 ```
 
-## Configure
-
-Declare the plugin in your `gatsby-config.js`, taking care to pass your AWS
-credentials as
+Declare it in your `gatsby-config.js`, making sure to pass your AWS credentials
+as
 [environment variables](https://www.gatsbyjs.org/docs/environment-variables/):
 
 ```javascript
-// configure dotenv
-// see https://www.gatsbyjs.org/docs/environment-variables
-require("dotenv").config({
-  path: `.env.${process.env.NODE_ENV}`
-});
+// gatsby-config.js
+require("dotenv").config();
 
 module.exports = {
   plugins: [
@@ -32,63 +31,70 @@ module.exports = {
       aws: {
         accessKeyId: process.env.AWS_ACCESS_KEY_ID,
         secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
-        region: process.env.AWS_REGION
+        region: process.env.AWS_REGION,
       },
-      buckets: ["my-bucket", "my-second-bucket"]
-    }
-  }
-]
+      buckets: ["my-bucket", "my-second-bucket"],
+    },
+  },
+],
 };
 ```
 
-Currently, your buckets will need to be configured for public access with this
-access policy: (add your bucket name under `Statement.Resource`)
+### AWS setup
+
+You can use the plugin both with private and public buckets.
 
 ```json
-{
-  "Version": "2008-10-17",
-  "Statement": [
-    {
-      "Sid": "AllowPublicRead",
-      "Effect": "Allow",
-      "Principal": {
-        "AWS": "*"
-      },
-      "Action": "s3:GetObject",
-      "Resource": "arn:aws:s3:::my-bucket/*"
-    }
-  ]
-}
+
 ```
 
-## Query
+## Querying S3 objects
 
-S3 objects can be queried in GraphQL as "s3Object" of "allS3Object":
+S3 objects can be queried in GraphQL as "s3Object" or "allS3Object":
 
 ```graphql
 query AllObjectsQuery {
   allS3Object {
     nodes {
-      Key
-      Url
+      Key # the object's key, i.e. file name
+      Bucket # the object's bucket name on S3
+      LastModified # the date the object was last modified
+      Size # the object's size in bytes
+      localFile # the local file node for image objects processed with sharp (see below)
     }
   }
 }
 ```
 
-## Image processing
+### Processing images with sharp
 
 Any images in your bucket(s) will be downloaded by the plugin and stored as
-local files, to be processed with `gatsby-plugin-sharp` and
+local file nodes, to be processed with `gatsby-plugin-sharp` and
 `gatsby-transformer-sharp`.
 
+If you don't have them yet, you will need to add the sharp plugin and
+transformer to your Gatsby site:
+
 ```bash
 # with npm
 npm install gatsby-plugin-sharp gatsby-transformer-sharp
 # with yarn
 yarn add gatsby-plugin-sharp gatsby-transformer-sharp
 ```
 
+```javascript
+// gatsby-config.js
+module.exports = {
+  plugins: [
+    // ...
+    `gatsby-plugin-sharp`,
+    `gatsby-transformer-sharp`,
+  ],
+};
+```
+
+You can then query the processed images with GraphQL:
+
 ```graphql
 query AllImagesQuery {
   images: allS3Object {
@@ -106,9 +112,19 @@ query AllImagesQuery {
   }
 }
 ```
 
+And use them with `gatsby-image`:
+
+```javascript
+import Img from "gatsby-image";
+
+const Image = ({ s3Object }) => (
+  <Img fluid={s3Object.localFile.childImageSharp.fluid} />
+);
+```
+
 ## Thanks
 
-This plugin was based on Dustin Schau's
+This plugin was initially based on Dustin Schau's
 [`gatsby-source-s3`](https://github.com/DSchau/gatsby-source-s3/) and influenced
-by Jesse Stuart's Typescript
+by Jesse Stuart's TypeScript
 [`gatsby-source-s3-image`](https://github.com/jessestuart/gatsby-source-s3-image).
````

cypress.json

Lines changed: 1 addition & 1 deletion

````diff
@@ -1,5 +1,5 @@
 {
-  "integrationFolder": "tests",
+  "integrationFolder": "tests/e2e",
   "screenshotsFolder": "tests/screenshots",
   "fixturesFolder": false,
   "supportFile": false,
````
Lines changed: 79 additions & 0 deletions (new file)

````diff
@@ -0,0 +1,79 @@
+# gatsby-starter-source-s3
+
+This starter is an example of how to source objects from AWS S3 in a Gatsby site
+at build time, using `@robinmetral/gatsby-source-s3`.
+
+It uses a local version of the plugin located in `/src`, and it can be used for
+local development and testing.
+
+To run it locally, you'll need to add the following environment variables in a
+`.env` file:
+
+```bash
+AWS_ACCESS_KEY_ID=""
+AWS_SECRET_ACCESS_KEY=""
+AWS_REGION=""
+```
+
+## AWS S3 setup
+
+This site sources images from three separate buckets:
+
+1. gatsby-source-s3-example (public)
+2. gatsby-source-s3-example-2 (public)
+3. gatsby-source-s3-continuation-token (private)
+
+The first two buckets are set up for public access with the following policy:
+
+```json
+{
+  "Version": "2008-10-17",
+  "Statement": [
+    {
+      "Sid": "AllowPublicRead",
+      "Effect": "Allow",
+      "Principal": {
+        "AWS": "*"
+      },
+      "Action": "s3:GetObject",
+      "Resource": "arn:aws:s3:::gatsby-source-s3-example/*"
+    }
+  ]
+}
+```
+
+_Note: the resource is the bucket's arn with the `/*` scope._
+
+The third bucket is private, its policy is the default for S3 (i.e. nothing was
+changed when creating the bucket).
+
+## AWS IAM setup
+
+The AWS access keys used by this example are for a `gatsby-source-s3` user to
+which I attached the following access policy:
+
+```json
+{
+  "Version": "2012-10-17",
+  "Statement": [
+    {
+      "Effect": "Allow",
+      "Action": ["s3:ListBucket"],
+      "Resource": [
+        "arn:aws:s3:::gatsby-source-s3-example",
+        "arn:aws:s3:::gatsby-source-s3-example-2",
+        "arn:aws:s3:::gatsby-source-s3-continuation-token"
+      ]
+    },
+    {
+      "Effect": "Allow",
+      "Action": ["s3:GetObject"],
+      "Resource": [
+        "arn:aws:s3:::gatsby-source-s3-example/*",
+        "arn:aws:s3:::gatsby-source-s3-example-2/*",
+        "arn:aws:s3:::gatsby-source-s3-continuation-token/*"
+      ]
+    }
+  ]
+}
+```
````
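The ListBucket/GetObject split in the IAM policy above matters: listing requires the bucket arn itself, while reading objects requires the `/*` object arn. A quick way to sanity-check a policy document before pasting it into the AWS console is to parse it and inspect the granted actions; the sketch below uses a trimmed-down copy of the policy and is purely illustrative, not an AWS API call:

```javascript
// Parse a trimmed-down copy of the IAM policy and list the granted actions.
const policy = JSON.parse(`{
  "Version": "2012-10-17",
  "Statement": [
    { "Effect": "Allow", "Action": ["s3:ListBucket"],
      "Resource": ["arn:aws:s3:::gatsby-source-s3-example"] },
    { "Effect": "Allow", "Action": ["s3:GetObject"],
      "Resource": ["arn:aws:s3:::gatsby-source-s3-example/*"] }
  ]
}`);

// Collect every action across all statements.
const actions = policy.Statement.flatMap((s) => s.Action);
console.log(actions); // [ 's3:ListBucket', 's3:GetObject' ]
```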

examples/gatsby-starter-source-s3/gatsby-config.js

Lines changed: 10 additions & 6 deletions

````diff
@@ -2,7 +2,7 @@ require("dotenv").config();
 
 module.exports = {
   siteMetadata: {
-    title: `gatsby-starter-source-s3`
+    title: `gatsby-starter-source-s3`,
   },
   plugins: [
     {
@@ -11,13 +11,17 @@ module.exports = {
       aws: {
         accessKeyId: process.env.AWS_ACCESS_KEY_ID,
         secretAccessKey: process.env.AWS_SECRET_ACCESS_KEY,
-        region: process.env.AWS_REGION
+        region: process.env.AWS_REGION,
       },
-      buckets: ["gatsby-source-s3-example", "gatsby-source-s3-example-2"]
-    }
+      buckets: [
+        "gatsby-source-s3-example",
+        "gatsby-source-s3-example-2",
+        "gatsby-source-s3-continuation-token",
+      ],
+      },
     },
     // the sharp transformer and plugin are required to process images
     `gatsby-transformer-sharp`,
-    `gatsby-plugin-sharp`
-  ]
+    `gatsby-plugin-sharp`,
+  ],
 };
````

examples/gatsby-starter-source-s3/src/pages/index.js

Lines changed: 20 additions & 5 deletions

````diff
@@ -3,12 +3,26 @@ import { graphql } from "gatsby";
 import Img from "gatsby-image";
 
 const IndexPage = ({ data }) => (
-  <>
+  <main style={{ fontFamily: "monospace" }}>
     <h1>{data.site.siteMetadata.title}</h1>
-    {data.allS3Object.nodes.map(image => (
-      <Img fixed={image.localFile.childImageSharp.fixed} alt={image.Key} />
-    ))}
-  </>
+    <div
+      style={{
+        display: "grid",
+        gridTemplateColumns: "repeat(auto-fit,minmax(256px, 1fr))",
+      }}
+      className="images-grid"
+    >
+      {data.allS3Object.nodes.map((image) => (
+        <div className={`s3-image ${image.Key}-${image.Bucket}`}>
+          <Img fixed={image.localFile.childImageSharp.fixed} alt={image.Key} />
+          <br />
+          Key: {image.Key}
+          <br />
+          Bucket: {image.Bucket}
+        </div>
+      ))}
+    </div>
+  </main>
 );
 
 export const IMAGES_QUERY = graphql`
@@ -21,6 +35,7 @@ export const IMAGES_QUERY = graphql`
     allS3Object {
       nodes {
         Key
+        Bucket
         localFile {
           childImageSharp {
             fixed(width: 256) {
````

package.json

Lines changed: 2 additions & 1 deletion

````diff
@@ -34,8 +34,9 @@
   "scripts": {
     "build": "tsc",
     "lint": "eslint '*/**/*.{ts,tsx}'",
-    "prestart": "yarn build && npm pack && (cd examples/gatsby-starter-source-s3 && yarn install && yarn add file:../../robinmetral-gatsby-source-s3-0.0.0-semantically-released.tgz)",
+    "prestart": "yarn build && npm pack && (cd examples/gatsby-starter-source-s3 && yarn install)",
     "start": "(cd examples/gatsby-starter-source-s3 && gatsby build && gatsby serve)",
+    "start:local": "yarn cache clean && (cd examples/gatsby-starter-source-s3 && rm -rf node_modules .cache public yarn.lock) && yarn start",
     "test": "cypress run",
     "e2e": "start-server-and-test http://localhost:9000"
   },
````

src/gatsby-node.ts

Lines changed: 11 additions & 12 deletions

````diff
@@ -1,6 +1,8 @@
 import { createRemoteFileNode } from "gatsby-source-filesystem";
 import AWS = require("aws-sdk");
 
+const s3 = new AWS.S3();
+
 const isImage = (key: string): boolean =>
   /\.(jpe?g|png|webp|tiff?)$/i.test(key);
 
@@ -26,8 +28,6 @@ export async function sourceNodes(
   AWS.config.update(awsConfig);
 
   // get objects
-  const s3 = new AWS.S3();
-
   const getS3ListObjects = async (params: {
     Bucket: string;
     ContinuationToken?: string;
@@ -86,19 +86,11 @@ export async function sourceNodes(
   const objects = allBucketsObjects.reduce((acc, val) => acc.concat(val), []);
 
   // create file nodes
-  // todo touch nodes if they exist already
   objects?.forEach(async (object) => {
-    const { Key, Bucket } = object;
-    const { region } = awsConfig;
-
     createNode({
       ...object,
-      // construct url
-      Url: `https://s3.${
-        region ? `${region}.` : ""
-      }amazonaws.com/${Bucket}/${Key}`,
       // node meta
-      id: createNodeId(`s3-object-${Key}`),
+      id: createNodeId(`s3-object-${object.Key}`),
       parent: null,
       children: [],
       internal: {
@@ -123,9 +115,16 @@ export async function onCreateNode({
 }) {
   if (node.internal.type === "S3Object" && node.Key && isImage(node.Key)) {
     try {
+      // get pre-signed URL
+      const url = s3.getSignedUrl("getObject", {
+        Bucket: node.Bucket,
+        Key: node.Key,
+        Expires: 60,
+      });
+
       // download image file and save as node
       const imageFile = await createRemoteFileNode({
-        url: node.Url,
+        url,
         parentNodeId: node.id,
         store,
         cache,
````
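The `isImage` gate in the diff above decides which S3 objects get a pre-signed URL and a local file node. The extension check can be exercised on its own; this sketch mirrors the regex from `src/gatsby-node.ts`:

```javascript
// Case-insensitive extension check for image objects, matching the
// regex used in src/gatsby-node.ts (jpg/jpeg/png/webp/tif/tiff).
const isImage = (key) => /\.(jpe?g|png|webp|tiff?)$/i.test(key);

console.log(isImage("photo.JPG")); // true
console.log(isImage("pic.tif")); // true
console.log(isImage("data.json")); // false
```

Note that keys like `archive.tar.gz` or `movie.mp4` fall through, so only image objects trigger the `createRemoteFileNode` download.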
