
@langchain/aws v1.1.1

LangChain AWS integration

@langchain/aws

This package contains the LangChain.js integrations for AWS through their SDK.

Installation

npm install @langchain/aws

This package, along with the main LangChain package, depends on @langchain/core. If you are using this package with other LangChain packages, you should make sure that all of the packages depend on the same instance of @langchain/core. You can do so by adding appropriate fields to your project's package.json like this:

{
  "name": "your-project",
  "version": "0.0.0",
  "dependencies": {
    "@langchain/aws": "^0.0.1",
    "@langchain/core": "^0.3.0"
  },
  "resolutions": {
    "@langchain/core": "^0.3.0"
  },
  "overrides": {
    "@langchain/core": "^0.3.0"
  },
  "pnpm": {
    "overrides": {
      "@langchain/core": "^0.3.0"
    }
  }
}

Which field you need depends on your package manager: yarn reads resolutions, npm reads overrides, and pnpm reads pnpm.overrides. We recommend including all three to maximize compatibility.
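After installing, you can verify that only a single copy of @langchain/core was resolved. Each package manager has a command that lists every resolved copy of a dependency:

```shell
# Should print exactly one resolved version of @langchain/core
npm ls @langchain/core

# Equivalents for the other package managers:
#   yarn why @langchain/core
#   pnpm why @langchain/core
```

If more than one version appears, your override fields are not being applied and you may see subtle type or instanceof errors between LangChain packages.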

Chat Models

This package contains the ChatBedrockConverse class, which is the recommended way to interface with the AWS Bedrock Converse series of models.

To use it, install the package as shown above, then configure your environment with standard AWS credential environment variables:

export BEDROCK_AWS_REGION=
export BEDROCK_AWS_SECRET_ACCESS_KEY=
export BEDROCK_AWS_ACCESS_KEY_ID=

Alternatively, set the AWS_BEARER_TOKEN_BEDROCK environment variable to authenticate with an API key. For additional API key details, refer to the AWS Bedrock documentation.

export BEDROCK_AWS_REGION=
export AWS_BEARER_TOKEN_BEDROCK=
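With bearer-token authentication no explicit credentials object is needed. A minimal sketch, assuming the underlying AWS SDK picks the token up from the environment:

```typescript
import { ChatBedrockConverse } from "@langchain/aws";

// Assumes AWS_BEARER_TOKEN_BEDROCK is set in the environment; the AWS SDK
// uses it for API-key authentication, so no credentials object is passed.
const model = new ChatBedrockConverse({
  model: "anthropic.claude-3-haiku-20240307-v1:0",
  region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
});
```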

Then initialize the model:

import { ChatBedrockConverse } from "@langchain/aws";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatBedrockConverse({
  model: "anthropic.claude-3-haiku-20240307-v1:0",
  region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
  credentials: {
    secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY,
    accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID,
  },
});

const response = await model.invoke([new HumanMessage("Hello world!")]);

Using Application Inference Profiles

AWS Bedrock Application Inference Profiles allow you to define custom endpoints that can route requests across regions or manage traffic for your models.

You can use an inference profile ARN by passing it to the applicationInferenceProfile parameter. When provided, this ARN will be used for the actual inference calls instead of the model ID:

import { ChatBedrockConverse } from "@langchain/aws";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatBedrockConverse({
  region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
  model: "anthropic.claude-3-haiku-20240307-v1:0",
  applicationInferenceProfile:
    "arn:aws:bedrock:eu-west-1:123456789102:application-inference-profile/fm16bt65tzgx",
  credentials: {
    secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY,
    accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID,
  },
});

const response = await model.invoke([new HumanMessage("Hello world!")]);

Important: You must still provide the model parameter with the actual model ID (e.g., "anthropic.claude-3-haiku-20240307-v1:0"), even when using an inference profile. This ensures proper metadata tracking in tools like LangSmith, including accurate cost and latency measurements per model. The applicationInferenceProfile ARN will override the model ID only for the actual inference API calls.

Note: AWS does not currently provide an API to programmatically retrieve the underlying model from an inference profile ARN, so it's the user's responsibility to ensure the model parameter matches the model configured in the inference profile.

Streaming

import { ChatBedrockConverse } from "@langchain/aws";
import { HumanMessage } from "@langchain/core/messages";

const model = new ChatBedrockConverse({
  model: "anthropic.claude-3-haiku-20240307-v1:0",
  region: process.env.BEDROCK_AWS_REGION ?? "us-east-1",
  credentials: {
    secretAccessKey: process.env.BEDROCK_AWS_SECRET_ACCESS_KEY,
    accessKeyId: process.env.BEDROCK_AWS_ACCESS_KEY_ID,
  },
});

const stream = await model.stream([new HumanMessage("Hello world!")]);

for await (const chunk of stream) {
  console.log(chunk.content);
}

Development

To develop the AWS package, you'll need to follow these instructions:

Install dependencies

pnpm install

Build the package

pnpm build

Or from the repo root:

pnpm build --filter @langchain/aws

Run tests

Test files should live within a tests/ directory in the src/ folder. Unit tests should end in .test.ts and integration tests should end in .int.test.ts:

$ pnpm test
$ pnpm test:int
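As a sketch, a minimal unit test following these conventions might look like this (assuming Jest-style globals as used across the LangChain monorepo; the file path and assertions are illustrative):

```typescript
// src/tests/chat_models.test.ts
// Illustrative unit test: verifies the model can be constructed offline,
// without making any network calls to Bedrock.
import { ChatBedrockConverse } from "../chat_models.js";

describe("ChatBedrockConverse", () => {
  it("constructs with explicit credentials", () => {
    const model = new ChatBedrockConverse({
      model: "anthropic.claude-3-haiku-20240307-v1:0",
      region: "us-east-1",
      credentials: {
        accessKeyId: "test",
        secretAccessKey: "test",
      },
    });
    expect(model).toBeInstanceOf(ChatBedrockConverse);
  });
});
```

Anything that actually calls the Bedrock API belongs in a .int.test.ts file instead, so it only runs under pnpm test:int.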

Lint & Format

Run the linter & formatter to ensure your code is up to standard:

pnpm lint && pnpm format

Adding new entrypoints

If you add a new file to be exported, either import & re-export from src/index.ts, or add it to the exports field in the package.json file and run pnpm build to generate the new entrypoint.
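As a sketch, a new entrypoint declared via the exports field might look like this (the ./tools subpath and dist file names are hypothetical, for illustration only):

```json
{
  "exports": {
    ".": {
      "types": "./dist/index.d.ts",
      "import": "./dist/index.js"
    },
    "./tools": {
      "types": "./dist/tools.d.ts",
      "import": "./dist/tools.js"
    }
  }
}
```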

Publishing

After running pnpm build, publish a new version with:

$ npm publish