Replies: 1 comment
🤖 Hi there,

Yes, it is indeed possible to implement the Joint Text-to-SQL and Semantic Search feature using a pre-built schema with data already in it. The SQLJoinQueryEngine class is designed to work with an existing SQL database: it does not create the SQL table itself, but rather queries an existing SQL database and another query engine in sequence.

The SQLJoinQueryEngine class takes a sql_query_tool as an argument during initialization. The query engine wrapped by this tool is expected to be an instance of BaseSQLTableQueryEngine or NLSQLTableQueryEngine, both of which interact with an existing SQL database. Here is the relevant code snippet that validates the sql_query_tool:

if not isinstance(
    sql_query_tool.query_engine,
    (BaseSQLTableQueryEngine, NLSQLTableQueryEngine),
):
    raise ValueError(
        "sql_query_tool.query_engine must be an instance of "
        "BaseSQLTableQueryEngine or NLSQLTableQueryEngine"
    )

And here is the code snippet where the SQL database is queried:

# first query SQL database
sql_response = self._sql_query_tool.query_engine.query(query_bundle)

These snippets show that the SQLJoinQueryEngine class and its subclass SQLAutoVectorQueryEngine work against an existing SQL database and do not create the SQL table themselves.

To connect the SQLJoinQueryEngine or SQLAutoVectorQueryEngine to an existing SQL database such as Snowflake, you would follow these steps:

1. Import the required modules:
from llama_index import (
VectorStoreIndex,
SimpleDirectoryReader,
ServiceContext,
StorageContext,
SQLDatabase,
WikipediaReader,
)
from sqlalchemy import (
create_engine,
MetaData,
Table,
Column,
String,
Integer,
select,
column,
)
from llama_index.query_engine import SQLJoinQueryEngine, RetrieverQueryEngine
from llama_index.tools.query_engine import QueryEngineTool
from llama_index.tools import ToolMetadata
from llama_index.indices.vector_store import VectorIndexAutoRetriever
from llama_index.query_engine import SubQuestionQueryEngine
from llama_index.indices.struct_store.sql_query import NLSQLTableQueryEngine
2. Create a SQLAlchemy engine for your database. Replace "your_snowflake_connection_string" with the actual connection string for your Snowflake (or other) database:

engine = create_engine("your_snowflake_connection_string", future=True)
metadata_obj = MetaData()

3. Create a SQLDatabase object over that engine. Replace "your_table_name" with the name of your existing table:

sql_database = SQLDatabase(engine, include_tables=["your_table_name"])

4. Create an NLSQLTableQueryEngine over the database and table:

sql_query_engine = NLSQLTableQueryEngine(
    sql_database=sql_database,
    tables=["your_table_name"],
)
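Note: the final snippet below also references sql_tool, s_engine_tool, and service_context, which are not defined in this answer. A minimal sketch of how they might be built, assuming a plain VectorStoreIndex over a local "./data" folder of documents (the folder path, tool descriptions, and similarity_top_k value are placeholders, not part of the original answer), could look like this:

# The classes used here are already imported in step 1.
documents = SimpleDirectoryReader("./data").load_data()
service_context = ServiceContext.from_defaults()

# Build a vector index over the unstructured documents for semantic search.
vector_index = VectorStoreIndex.from_documents(
    documents, service_context=service_context
)
vector_query_engine = vector_index.as_query_engine(similarity_top_k=2)

# Wrap each engine in a QueryEngineTool so SQLJoinQueryEngine can route between them.
sql_tool = QueryEngineTool.from_defaults(
    query_engine=sql_query_engine,
    description=(
        "Useful for translating a natural language question into a SQL query "
        "over your_table_name"
    ),
)
s_engine_tool = QueryEngineTool.from_defaults(
    query_engine=vector_query_engine,
    description="Useful for answering semantic questions about the indexed documents",
)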
5. Create the SQLJoinQueryEngine from the two tools. In this step, the SQL tool and the semantic-search tool are combined into a single joint query engine:

query_engine = SQLJoinQueryEngine(
    sql_tool, s_engine_tool, service_context=service_context
)

Please note that the code snippets provided are based on the context above and may need to be adjusted for your specific use case and environment.
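Once constructed, the joint engine can be queried like any other query engine; the question string below is only a placeholder to illustrate the call:

# Ask a question that needs both the SQL table and the semantic index.
response = query_engine.query(
    "Which rows in your_table_name relate to topic X, and what do the documents say about it?"
)
print(str(response))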
I hope this helps! Let me know if you have any other questions.

This response is meant to be useful and save you time. It is not meant to be a precise solution, but rather a starting point for your own research. Help me be more useful! Please leave a 👍 if this is helpful and 👎 if it is irrelevant. If you want to continue the conversation, start your reply with @dosu-bot.
Question
Hi, I'm trying to implement the Joint Text-to-SQL and Semantic Search feature combining structured and unstructured data. In the example provided you natively create the SQL table, but I was wondering whether there is a way to import and use a pre-built schema with data already in it. Would it be through using the database tool?