- Introduction
- Prerequisites
- Deployment
- Data import
- Application User Guide
- Cleaning up
- Architecture Diagram
- Security
- AWS Guidance
- Blog
- Contributors
- FAQ
- License
Manufacturing organizations have vast amounts of knowledge dispersed across the product lifecycle, which can result in limited visibility, knowledge gaps, and the inability to continuously improve. A digital thread offers an integrated approach to combine disparate data sources across enterprise systems to drive traceability, accessibility, collaboration, and agility.
In this sample project, learn how to create an intelligent manufacturing digital thread using a combination of knowledge graph and generative AI technologies, based on data generated throughout the product lifecycle and its interconnected relationships. Explore use cases and discover actionable steps to start your intelligent digital thread journey using graph and generative AI on AWS.
To execute the steps outlined in this guidance, you will need the following:
- An AWS account – how to create a new AWS account
- Access to the corresponding foundation models in Amazon Bedrock – how to manage model access in Amazon Bedrock
- Create and configure the local environment
- Ensure you have enough capacity for the creation of 3 Elastic IPs.
- The host platform should be linux/arm64 (see the FAQ entry on the platform-mismatch warning).
- Clone the repository into your environment:

  ```bash
  git clone https://github.com/aws-solutions-library-samples/guidance-for-digital-thread-using-graph-and-generative-ai-on-aws.git
  cd guidance-for-digital-thread-using-graph-and-generative-ai-on-aws
  ```
- To deploy this app, run:

  ```bash
  chmod +x deploy-script.sh
  ./deploy-script.sh
  ```
The deploy-script.sh will set up the following resources in your account:
- Amazon Cognito User pool with a demo user account
- Amazon Neptune Serverless cluster
- Amazon Neptune workbench SageMaker notebook
- A VPC
- Subnets/Security Groups
- Application Load Balancer
- Amazon ECR Repository
- ECS Cluster & Service running on AWS Fargate
If you are asked which AWS credentials to use, as shown below, please read Configure AWS credentials.

```
Which credentials would you like to use to create demo? [Use arrows to move, type to filter, ? for more help]
> Enter temporary credentials
  [profile default]
```
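If you have not set up credentials yet, one minimal way is a named profile (the profile name below is just an example; any profile with sufficient permissions works):

```bash
# Interactive prompt for access key, secret key, and default Region:
aws configure --profile default

# Confirm the credentials resolve to the account you intend to deploy into:
aws sts get-caller-identity
```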
- Visit the URL shown after the AWS Copilot deployment completes to chat with the digital thread.

```
✔ Deployed service genai-chatbot-app.
Recommended follow-up action:
  - Your service is accessible at http://genai--Publi-xxxxxxx-111111111.xx-xxxx-x.elb.amazonaws.com over the internet.
```
Newly deployed Amazon Neptune clusters do not contain any data. To showcase the interaction between Amazon Bedrock generative AI and the Neptune knowledge graph-based digital thread, please follow the steps below to import the sample data from src/knowledge-graph/data/ into the graph database.
- Run the bash script below to create an S3 bucket and upload the src/knowledge-graph/data/ files into Amazon S3:

  ```bash
  ACCOUNT_ID=$(aws sts get-caller-identity --query "Account" --output text)
  S3_BUCKET_NAME="mfg-digitalthread-data-${ACCOUNT_ID}"
  aws s3 mb "s3://$S3_BUCKET_NAME"
  aws s3 cp ./src/knowledge-graph/data/ s3://$S3_BUCKET_NAME/sample_data/ --recursive
  ```
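  You can verify the upload with a quick listing (the variable comes from the script above):

  ```bash
  aws s3 ls "s3://$S3_BUCKET_NAME/sample_data/" --recursive
  ```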
- Open the Neptune Workbench Jupyter notebook.

  From the AWS Console:
  - Sign in to the AWS Management Console, and open the Amazon Neptune console at https://console.aws.amazon.com/neptune/home
  - In the navigation pane on the left, choose Notebooks.
  - Select the notebook deployed by the `deploy-script.sh` CloudFormation stack.
  - Choose Actions -> Open Jupyter.
  From the URL in the CloudFormation stack:
  - Sign in to the AWS Management Console, and open the AWS CloudFormation console at https://console.aws.amazon.com/cloudformation/
  - In the navigation pane on the left, choose Stacks.
  - Select the stack `mfg-dt-neptune`.
  - In the right pane, select the Outputs tab.
  - Find the `NeptuneSagemakerNotebook` key to get the URL of the Neptune SageMaker notebook (e.g. https://aws-neptune-notebook-for-neptunedbcluster-xxxxxxxx.notebook.xx-xxxx-x.sagemaker.aws/). Alternatively, you can fetch the same URL with the AWS CLI, as shown below.
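  A minimal CLI alternative, using the stack name and output key from the steps above:

  ```bash
  aws cloudformation describe-stacks \
    --stack-name mfg-dt-neptune \
    --query "Stacks[0].Outputs[?OutputKey=='NeptuneSagemakerNotebook'].OutputValue" \
    --output text
  ```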
- After you go into the Jupyter notebook, click the `Upload` button in the top-right corner and upload the src/knowledge-graph/mfg-neptune-bulk-import.ipynb file into the Neptune notebook. (Click the blue `Upload` button to confirm the upload.)
- Open `mfg-neptune-bulk-import.ipynb` and follow the steps inside the notebook to load the sample data into the Neptune database.
- A successful data import will generate the knowledge graph shown below.
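For reference, the notebook's load step corresponds to the Amazon Neptune bulk loader API. A hedged sketch of the equivalent raw HTTP call (endpoint, bucket, role, and Region are placeholders for your environment):

```bash
# The loader endpoint lives on the Neptune cluster itself, so this must run
# from inside the VPC (for example, a terminal on the workbench notebook).
curl -X POST "https://<neptune-cluster-endpoint>:8182/loader" \
  -H 'Content-Type: application/json' \
  -d '{
        "source": "s3://<your-bucket>/sample_data/",
        "format": "csv",
        "iamRoleArn": "arn:aws:iam::<account-id>:role/<neptune-load-role>",
        "region": "<aws-region>",
        "failOnError": "FALSE"
      }'
```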
- You will be asked to log in with the Cognito user. In this demo, a sample user `demo_user` is created with the temporary password `TempPassw0rd!`. A password reset is required when you log in for the first time; please make sure you follow the password guidelines.
- The main page will be displayed, and you can chat with the digital thread generative AI and graph application. Sample questions can be found by expanding the `Example questions` menu.
Attention: All data in Amazon Neptune will be lost after cleaning up.
Since this demo sets up resources in your account, let's delete them so you don't get charged.
The cleanup-script.sh will delete the following resources in your account:
- Amazon Cognito User pool with the demo user account
- Amazon Neptune Serverless cluster
- Amazon Neptune workbench SageMaker notebook
- A VPC
- Subnets/Security Groups
- Application Load Balancer
- Amazon ECR Repositories
- ECS Cluster & Service running on AWS Fargate
```bash
chmod +x cleanup-script.sh
./cleanup-script.sh
```

Input `y` to confirm the cleanup:

```
This script is to clean up the Manufacturing Digital thread (Graph and Generative AI) demo application.
Are you sure to delete the demo application? (y/n): y
Are you sure you want to delete application genai-chatbot-app? [? for help] (y/N) y
```
Finally, you will get the message "CloudFormation is being deleted. It will be removed in minutes. Please check the CloudFormation console https://console.aws.amazon.com/cloudformation/home". It will take 10-15 minutes to clean up the resources in your account.
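If you prefer to watch the teardown from the CLI, you can poll one of the stacks, for example the Neptune stack from the data import section (other stacks created by the demo may have different names):

```bash
# Once deletion completes, this call returns a "stack does not exist" error.
aws cloudformation describe-stacks \
  --stack-name mfg-dt-neptune \
  --query "Stacks[0].StackStatus"
```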
Detailed description
- Identify key stakeholders in the manufacturing organization: To embark on a successful journey towards implementing cutting-edge technologies like Generative AI, graphs, and digital thread, it's essential to identify key stakeholders within the manufacturing organization. This includes design engineering, manufacturing engineering, supply chain professionals, operations teams, CXOs, and IT experts. Understanding their distinct business interests and use cases lays the foundation for a connected digital thread.
- Identify data sources for building the Digital Thread: Determine the fundamental data sources required to build a comprehensive digital thread using graph technologies. These may include Product Lifecycle Management (PLM), Enterprise Resource Planning (ERP), Manufacturing Execution Systems/Manufacturing Operations Management (MES/MOM), Customer Relationship Management (CRM), and other enterprise applications. By identifying these sources and data elements, enterprises can ensure the inclusion of critical data points for a holistic view of their operations. In this sample code, we have provided a sample set of objects from PLM, ERP, and MES, along with their interconnected relationships.
- Upload Data to S3: Upload the data to Amazon Simple Storage Service (Amazon S3). This scalable and durable cloud storage solution provides a secure repository for the collected data, setting the stage for further processing and analysis.
- Use Bulk Loader to load data into the Amazon Neptune database: Leverage the Amazon Neptune bulk loader capability to load the data stored in Amazon S3 into the Amazon Neptune graph database. The necessary schema and relationships are created within Neptune to form a knowledge graph, which provides the basis for generative AI queries.
- Create User interface: Create a front end by combining a Streamlit app, Amazon Elastic Container Service (ECS) with Fargate for container orchestration, Amazon Elastic Container Registry (ECR) for managing container images, an Elastic Load Balancer (ELB) for efficient traffic distribution, and Amazon Cognito for secure user authentication. This comprehensive setup, orchestrated with the AWS Copilot CLI, ensures a scalable, secure, and responsive user interface, facilitating a seamless experience for stakeholders interacting with the digital thread and linked manufacturing data.
- Establish the knowledge graph and LLM connection, and orchestrate using LangChain: Establish the linkage between Amazon Bedrock (Claude 3.5 Sonnet) and Amazon Neptune, and orchestrate the integration seamlessly with LangChain. The orchestrator coordinates the entire process of generating the query from the foundation model, executing the query against the knowledge graph, and returning the results in natural language to the user.
- If you need an HTTPS connection, please create a new SSL/TLS certificate in AWS Certificate Manager (ACM) and associate it with the load balancer. How to create an SSL/TLS certificate and associate it with a load balancer
- Use AWS WAF (Web Application Firewall), which helps protect your web applications from common application-layer exploits that can affect availability or consume excessive resources.
- A periodic review of your IAM roles/users is recommended to ensure that they grant only the minimum privileges required for the function, applying least-privilege permissions. You may also use IAM Access Analyzer to identify unused access.
- Always grant the minimum required permissions in Security Groups. Neptune Security
- Please follow the Security in the cloud section of the shared responsibility model. Shared Responsibility Model
See CONTRIBUTING for more information.
For AWS Guidance, please visit Guidance for Digital Thread Using Graph and Generative AI on AWS
The blog will be released in April 2024.
- Can I execute the cleanup-script.sh if the Neptune cluster is in a stopped state?

No. The CloudFormation deletion will fail with the error "Db cluster neptunedbcluster is in stopped state". Please start the Neptune cluster through either the AWS console or the CLI before proceeding with the cleanup, for example as shown below.
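A minimal sketch; the cluster identifier below is taken from the error message above, so substitute yours if it differs:

```bash
aws neptune start-db-cluster --db-cluster-identifier neptunedbcluster
```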
- What should I do when CloudFormation fails to create the Neptune cluster with the error "The following resource(s) failed to create: [ElasticIP3, ElasticIP1, ElasticIP2]"?

Before running the Neptune CloudFormation template, ensure you have enough capacity for the creation of 3 Elastic IPs. Verify the number of Elastic IPs in the AWS console at https://console.aws.amazon.com/ec2/home?#Addresses: before deploying the script.
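You can also count the Elastic IPs currently allocated in the Region from the CLI:

```bash
aws ec2 describe-addresses --query "length(Addresses)"
```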
- Can I create a new user apart from the demo_user?

Yes. You can navigate to the Amazon Cognito user pool and create a new user using the AWS console or the CLI.
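A hedged CLI sketch (the user pool ID, user name, and password are placeholders; find the pool ID in the Cognito console or via `aws cognito-idp list-user-pools --max-results 10`):

```bash
aws cognito-idp admin-create-user \
  --user-pool-id <your-user-pool-id> \
  --username new_demo_user \
  --temporary-password 'An0ther-TempPassw0rd!'
```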
- I got the error "jq: command not found" while running the deploy-script.sh. How do I fix it?

Please visit the Install jq page for more information.
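For example, on an Amazon Linux (yum-based) host:

```bash
sudo yum install -y jq
```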
- What do I do if I get the warning 'The requested image's platform (linux/arm64/v8) does not match the detected host platform (linux/amd64) and no specific platform was requested' followed by a failure during copilot deploy?

This error can be resolved by deploying the script from an arm64-based instance. See the platform attribute in the manifest.yml file under copilot/genai-chatbot-app; it is set to linux/arm64.
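Two quick checks, assuming you have shell access to the deployment host:

```bash
# Host architecture (arm64 hosts typically report aarch64):
uname -m

# Platform requested by the service manifest:
grep -n 'platform' copilot/genai-chatbot-app/manifest.yml
```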
- Can this solution be adapted for use in other domains, and if so, what is the process?

Step 1: Identify the domain-specific customer problem.
Step 2: Identify the relevant stakeholders.
Step 3: Understand the problem and create questions.
Step 4: Identify the relevant systems and data.
Step 5: Create the edges and vertices CSV files and place them in the knowledge-graph/data/edges and knowledge-graph/data/vertices folders (see the format sketch after this list).
Step 6: Load the files using the S3 loader and run the Neptune statistics using src/knowledge-graph/mfg-neptune-bulk-import.ipynb.
Step 7: Chat with the graph.
Step 8: If the response is inaccurate, update the prompt template by providing an example query and the corresponding answer.

When engaging with customers to understand their needs, use the template below.
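For Step 5, here is a hypothetical minimal pair of files in the Neptune Gremlin bulk-load CSV format (the `~id`/`~label`/`~from`/`~to` headers follow the Neptune bulk-load conventions; the rows are made up for illustration, not taken from the sample data):

```bash
cat > vertices.csv <<'EOF'
~id,~label,name:String
p100,Product,Electric Motor
p101,Component,Rotor
EOF

cat > edges.csv <<'EOF'
~id,~from,~to,~label
e200,p100,p101,hasComponent
EOF
```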
- I made minor adjustments to the existing graph by adding new edges and vertices, but the chat application doesn't seem to recognize the changes. What could be the reason for this issue?

The LangChain Neptune graph integration gets the node and edge labels from the Neptune statistics summary. Neptune statistics are currently regenerated only when more than 10% of the data in your graph has changed or when the latest statistics are more than 10 days old. To solve the problem, run the statistics command `%statistics --mode refresh` immediately after loading any additional changes (refer to mfg-neptune-bulk-import.ipynb).
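Outside the workbench notebook, a hedged equivalent is the cluster's statistics endpoint (the endpoint name is a placeholder, and the call must originate inside the VPC):

```bash
curl -X POST "https://<neptune-cluster-endpoint>:8182/propertygraph/statistics" \
  -H 'Content-Type: application/json' \
  -d '{ "mode" : "refresh" }'
```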
- How do I reset the Neptune DB?

Please follow the "Workbench magic commands" outlined in this blog.
- What is the procedure for stopping the Neptune cluster and notebook to avoid incurring costs?

It is a best practice to stop the Neptune cluster and notebook when you are not using them. Follow the steps outlined below.
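A CLI sketch; verify the exact identifiers in your account before running (the notebook name is a placeholder):

```bash
# Stop the Neptune cluster (a stopped cluster restarts automatically after 7 days):
aws neptune stop-db-cluster --db-cluster-identifier neptunedbcluster

# Stop the workbench notebook instance; list instances with
# `aws sagemaker list-notebook-instances` to find the name:
aws sagemaker stop-notebook-instance --notebook-instance-name <your-neptune-notebook-name>
```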
- How much do Amazon Neptune and Amazon Bedrock cost?

Please refer to the Neptune Serverless pricing and Amazon Bedrock pricing for Anthropic models.
- In which AWS Regions is Amazon Bedrock available?

Please refer to this page for more details.
- I need to know more about Amazon Neptune and Amazon Bedrock.

Please see the Amazon Bedrock and Amazon Neptune product pages for more information.
This library is licensed under the MIT-0 License. See the LICENSE file.