This year, we're adding a powerful Vector Search capability to the InterSystems IRIS Data Platform, to help you innovate faster and build intelligent applications powered by Generative AI. At the center of the new capability is a native VECTOR datatype for IRIS SQL, along with similarity functions that leverage optimized chipset (SIMD) instructions.
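As a rough sketch of what those similarity functions compute (illustrative pure Python only; the real functions run inside IRIS SQL with SIMD acceleration):

```python
import math

def dot_product(a, b):
    # Element-wise product summed over both vectors
    return sum(x * y for x, y in zip(a, b))

def cosine_similarity(a, b):
    # Dot product normalized by the Euclidean norms of both vectors;
    # 1.0 means identical direction, 0.0 means orthogonal
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot_product(a, b) / (norm_a * norm_b)
```

In the demos, embeddings of text descriptions are stored as vectors, and the most similar rows are the ones with the highest cosine similarity to the query embedding.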
In the demos folder, `sql_demo.ipynb`, `langchain_demo.ipynb`, and `llama_demo.ipynb` are built on the community SQLAlchemy Python package, while `sql_dbapi_demo.ipynb` uses the official DB-API package. The Python `.whl` file is provided in the repo. It can also be downloaded here.
- Clone the repo: `git clone https://github.com/intersystems-community/hackathon-2024.git`
- Install IRIS Community Edition in a container: `docker run -d --name iris-comm -p 1972:1972 -p 52773:52773 -e IRIS_PASSWORD=demo -e IRIS_USERNAME=demo intersystemsdc/iris-community:latest`

  ℹ️ After running the above command, you can access the System Management Portal at http://localhost:52773/csp/sys/UtilHome.csp. Note that you may need to configure your web server separately when using another product edition.
- Create a Python environment and activate it (conda, venv, or however you wish). For example:

  conda: `conda create --name iris-env python=3.10`, then `conda activate iris-env`

  or venv (Windows): `python -m venv iris-env`, then `.\iris-env\Scripts\Activate`

  or venv (Unix): `python -m venv iris-env`, then `source ./iris-env/bin/activate`
- Install packages for all demos: `pip install -r requirements.txt`
- Install InterSystems' DB-API connector (run this from the root of the repo): `pip install intersystems_irispython-3.2.0-py3-none-any.whl`
- For `langchain_demo.ipynb` and `llama_demo.ipynb`, you need an OpenAI API key. Create a `.env` file in this repo to store the key: `OPENAI_API_KEY=xxxxxxxxx`
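The notebooks can pick the key up from that file. A minimal sketch of loading it (the demos likely use the `python-dotenv` package from `requirements.txt`; `load_env_file` here is a hypothetical stand-in parser so the snippet stays self-contained, and `.env.example` is a throwaway file created just for illustration):

```python
import os

def load_env_file(path=".env"):
    # Minimal .env parser: KEY=value lines; skips blanks and comments
    with open(path) as f:
        for line in f:
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            os.environ[key.strip()] = value.strip()

# Illustration only: write a throwaway .env-style file and load it
with open(".env.example", "w") as f:
    f.write("OPENAI_API_KEY=xxxxxxxxx\n")
load_env_file(".env.example")
```

With `python-dotenv` installed, `from dotenv import load_dotenv; load_dotenv()` achieves the same result against the real `.env` file.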
- Navigate to http://localhost:52773/csp/sys/UtilHome.csp and log in with username demo, password demo (or whatever you configured)
- Change the namespace (on the top left) from %SYS to USER
- On the left navigation pane, click 'System Explorer'
- Click 'SQL' -> 'Go'
- Here, you can execute SQL queries. You can also view the tables by clicking the relevant table on the left, under 'Tables', and then clicking 'Open Table' (above the SQL query box)
IRIS SQL now supports vector search alongside conditions on other columns! In this demo, we search a whiskey dataset for whiskeys that are priced under $100 and have a taste description similar to "earthy and creamy taste".
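A query combining a price filter with vector similarity follows this general shape (a hedged sketch, not the notebook's exact code: the table and column names `whiskey_reviews`, `price`, `taste_vector` and the embedding length 384 are made-up placeholders, though `TO_VECTOR` and `VECTOR_COSINE` are real IRIS SQL functions; check the IRIS docs for the exact `TO_VECTOR` argument order in your version):

```python
def build_search_query(top_k=3):
    # Hypothetical schema: whiskey_reviews(name, price, description, taste_vector)
    # The ? placeholder would be bound to the query embedding at execution time.
    return f"""
        SELECT TOP {top_k} name, price, description
        FROM whiskey_reviews
        WHERE price < 100
        ORDER BY VECTOR_COSINE(taste_vector, TO_VECTOR(?, DOUBLE, 384)) DESC
    """

query = build_search_query()
```

The key point is that the similarity ordering and the ordinary SQL `WHERE` clause compose in a single statement, which is what makes filtered RAG straightforward in IRIS SQL.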
IRIS now has a langchain integration as a VectorDB! In this demo, we use the langchain framework with IRIS to ingest and search through a document.
IRIS now has a llama_index integration as a VectorDB! In this demo, we use the llama_index framework with IRIS to ingest and search through a document.
If you need to use search with filters, use IRIS SQL. This is the most flexible way to build RAG.
If you're building a genAI app that uses a variety of tools (agents, chained reasoning, api calls), go for langchain.
If you're building a RAG app, go for llama_index.
The fastest and easiest way to contact any InterSystems Mentor is via Slack or Discord - feel free to ask any questions about our technology, or about your project in general!
Uses langchain-iris to search YouTube audio transcriptions
Original IRIS langchain demo, which runs the containerized IRIS in the notebook
Original IRIS llama_index demo, which runs the containerized IRIS in the notebook
Official page for InterSystems Documentation