pica-langchain

Install the Python SDK to unlock powerful tools for LangChain

Installation

Install the Pica LangChain SDK:

pip install pica-langchain

Configuration

The PicaClientOptions class allows you to configure the Pica client with the following options:

| Option | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| server_url | str | No | https://api.picaos.com | URL for a self-hosted Pica server. |
| connectors | List[str] | No | All available connectors | List of connector keys to filter by. Pass ["*"] to initialize all available connectors, or specific connector keys to filter. If empty, no connections will be initialized. |
| identity | str | No | None | Filter connections by a specific identifier. |
| identity_type | "user", "team", "project", "organization" | No | None | Filter connections by identity type. |
| authkit | bool | No | False | If True, the SDK uses AuthKit to prompt the user to connect to platforms they do not currently have access to. |

The create_pica_agent function allows customizing the following parameters:

| Option | Type | Required | Default | Description |
| --- | --- | --- | --- | --- |
| verbose | bool | No | False | Whether to print verbose logs. |
| system_prompt | str | No | None | A custom system prompt to append to the default system prompt. |
| agent_type | AgentType | No | OPENAI_FUNCTIONS | The type of agent to create. |
| tools | List[BaseTool] | No | None | A list of tools to use in the agent. |
| return_intermediate_steps | bool | No | False | Whether to return the intermediate steps of the agent. |

Usage

Here is a getting-started example using Pica with LangChain and OpenAI.

Ensure you have the following environment variables set:

export PICA_SECRET="your-pica-secret"
export OPENAI_API_KEY="your-openai-api-key"

import os
import sys
from langchain_openai import ChatOpenAI
from langchain.agents import AgentType
from pica_langchain import PicaClient, create_pica_agent
from pica_langchain.models import PicaClientOptions

def main():
    try:
        pica_client = PicaClient(
            secret=os.environ["PICA_SECRET"],
            options=PicaClientOptions(
                connectors=["*"] # Initialize all available connections or pass specific connector keys
                # server_url="https://my-self-hosted-server.com",
                # identity="user-id",
                # identity_type="user"
            )
        )
        
        llm = ChatOpenAI(
            temperature=0,
            model="gpt-4o",
        )

        # Create an agent with Pica tools
        agent = create_pica_agent(
            client=pica_client,
            llm=llm,
            agent_type=AgentType.OPENAI_FUNCTIONS,
        )

        # List the connections the agent has access to
        result = agent.invoke({
            "input": (
                "What connections do I have access to?"
            )
        })
        
        print(f"\nWorkflow Result:\n {result}")
    
    except Exception as e:
        print(f"ERROR: An unexpected error occurred: {e}")
        sys.exit(1)


if __name__ == "__main__":
    main()

Logging

The Pica LangChain SDK uses the logging module to log messages. The log level can be set using the PICA_LOG_LEVEL environment variable.

The following log levels are available:

  • debug
  • info
  • warning
  • error
  • critical

export PICA_LOG_LEVEL="debug"
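
These names line up with the levels of Python's standard logging module. As a rough sketch of how such a variable maps onto logging constants (the SDK's actual handling may differ):

```python
import logging
import os

# Illustrative mapping from PICA_LOG_LEVEL-style names to logging levels.
# The SDK resolves this internally; this sketch only shows the idea.
LEVELS = {
    "debug": logging.DEBUG,
    "info": logging.INFO,
    "warning": logging.WARNING,
    "error": logging.ERROR,
    "critical": logging.CRITICAL,
}

level_name = os.environ.get("PICA_LOG_LEVEL", "info").lower()
logging.basicConfig(level=LEVELS.get(level_name, logging.INFO))
```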

Examples

Code Solutions

import os
from langchain_openai import ChatOpenAI
from langchain.agents import AgentType
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler
from pica_langchain import PicaClient, create_pica_agent
from pica_langchain.models import PicaClientOptions

pica_client = PicaClient(
    secret=os.environ["PICA_SECRET"],
    options=PicaClientOptions(
        connectors=["*"], # Initialize all available connections for this example
    )
)

llm_with_handler = ChatOpenAI(
    temperature=0,
    model="gpt-4o",
    streaming=True,
    callbacks=[StreamingStdOutCallbackHandler()]
)

agent_with_handler = create_pica_agent(
    client=pica_client,
    llm=llm_with_handler,
    agent_type=AgentType.OPENAI_FUNCTIONS,
)

for chunk in agent_with_handler.stream({
    "input": "List three platforms available in Pica."
}):
    print(chunk)

Workflows

from langchain_openai import ChatOpenAI
from langchain.agents import AgentType
from pica_langchain import PicaClient, create_pica_agent
from pica_langchain.models import PicaClientOptions

pica_client = PicaClient(
    secret="YOUR_PICA_SECRET",
    options=PicaClientOptions(
        connectors=["*"]
    )
)

llm = ChatOpenAI(temperature=0, model="gpt-4o")

agent = create_pica_agent(
    client=pica_client,
    llm=llm,
    agent_type=AgentType.OPENAI_FUNCTIONS
)

result = agent.invoke({
    "input": (
        "Star the picahq/pica repo in github. "
        "Then, list 5 of the repositories that I have starred in github."
    )
})

print(f"Result: {result}")

GitHub

@picahq/pica-langchain

Check out our GitHub repository to explore the code, contribute, or raise issues.