# docs: update the readme to note Google provided integrations #132

40 changes: 32 additions & 8 deletions README.md

The `langchain-postgres` package contains implementations of core LangChain abstractions using `Postgres`.

The package is released under the MIT license.

Feel free to use the abstractions as provided, or modify and extend them as appropriate for your own application.

## Requirements

The package currently only supports the [psycopg3](https://www.psycopg.org/psycopg3/) driver.

## Installation

```
pip install -U langchain-postgres
```

## Change Log

0.0.6:
- Remove langgraph as a dependency as it was causing dependency conflicts.
- The base interface for the checkpointer changed in langgraph, so the existing implementation would have broken regardless.

## Usage

### ChatMessageHistory

The chat message history abstraction helps to persist chat message history in a Postgres table.

`PostgresChatMessageHistory` is parameterized using a `table_name` and a `session_id`.

The `table_name` is the name of the table in the database where the chat messages will be stored.

The `session_id` is a unique identifier for the chat session. It can be assigned by the caller using `uuid.uuid4()`.


### Vector Store

Learn how to use the [`PGVector` vectorstore](https://github.com/langchain-ai/langchain-postgres/blob/main/examples/vectorstore.ipynb).

## Google Cloud Integrations

> **Review (Collaborator):** Two requests:
>
> 1. Could you add the standard test suite to the google packages and confirm that they pass the tests? It should be very quick to confirm and run (~8 lines of code): https://github.com/langchain-ai/langchain-postgres/blob/main/tests/unit_tests/test_vectorstore_standard_tests.py
> 2. Could you document the lack of db-agnostic filtering support? Easy filtering is a feature a lot of users look for, and if we're recommending something from this package, users will likely expect feature parity.
>
> **Reply ([@averikitsch](https://github.com/averikitsch), Oct 28, 2024):**
>
> 1. We will have to make a few updates to our vector store, i.e. support the `(a)get_by_ids()` method and support `Document` id fields in metadata.
> 2. I have added the note in the Readme. We will also look to support this.


[Google Cloud](https://python.langchain.com/docs/integrations/providers/google/) provides Vector Store, Chat Message History, and Data Loader integrations for [AlloyDB](https://cloud.google.com/alloydb) and [Cloud SQL](https://cloud.google.com/sql) for PostgreSQL databases via the following PyPI packages:

* [`langchain-google-alloydb-pg`](https://github.com/googleapis/langchain-google-alloydb-pg-python)

* [`langchain-google-cloud-sql-pg`](https://github.com/googleapis/langchain-google-cloud-sql-pg-python)

Using the Google Cloud integrations provides the following benefits:

- **Enhanced security**: Easily and securely connect to Google Cloud databases utilizing IAM for authorization and database authentication without needing to manage SSL certificates, configure firewall rules, or enable authorized networks.
- **Simplified and Secure Connections:** Connect to Google Cloud databases effortlessly using the instance name instead of complex connection strings. The integrations create a secure connection pool that can be easily shared across your application using the `engine` object.

Learn how to [migrate a `PGVector` vector store to `AlloyDBVectorStore`](https://github.com/googleapis/langchain-google-alloydb-pg-python/blob/main/samples/migrations/migrate_pgvector_to_alloydb.md) to gain the following benefits:

- **Simplified management**: a single table contains data corresponding to a single collection, making it easier to query, update, and maintain.
- **Schema flexibility**: allow users to add tables into any database schema.
- **Improved performance**: using a single-table schema can lead to faster query execution, especially for large collections.
- **Improved metadata handling**: store metadata in columns instead of JSON, resulting in significant performance improvements.
- **Clear separation**: clearly separate table and extension creation, allowing for distinct permissions and streamlined workflows.
- **Better integration with AlloyDB**: take advantage of AlloyDB's advanced indexing and scalability capabilities.

Note: The vector store currently supports metadata filtering using Postgres syntax instead of the [database-agnostic filter syntax](https://python.langchain.com/docs/how_to/query_constructing_filters/).
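To illustrate the difference (the `topic` field and the `langchain_metadata` column name are hypothetical; each value would be passed as the `filter` argument of a similarity search, and the exact semantics are documented per package):

```python
# Postgres-syntax filter, as used by the Google Cloud vector stores:
# a SQL predicate over the metadata column (column name is an assumption;
# it depends on your table schema).
postgres_filter = "langchain_metadata->>'topic' = 'databases'"

# Database-agnostic filter, as used by PGVector in this package:
# a dictionary of fields and operators.
agnostic_filter = {"topic": {"$eq": "databases"}}
```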