mirror of
https://github.com/hwchase17/langchain.git
synced 2025-06-26 16:43:35 +00:00
Updated partners/fireworks README (#18267)
## PR title
partners: changed the README file for the Fireworks integration in the libs/partners/fireworks folder

## PR message
Description: Changed the README file of partners/fireworks following the docs on https://python.langchain.com/docs/integrations/llms/Fireworks

The README includes:
- Brief description
- Installation
- Setting-up instructions (API key, model id, ...)
- Basic usage

Issue: https://github.com/langchain-ai/langchain/issues/17545
Dependencies: None
Twitter handle: None
This commit is contained in: parent df234fb171, commit 412148773c
This is the partner package for integrating Fireworks.ai with LangChain. Fireworks strives to provide good support for LangChain use cases, so if you run into any issues, please let us know. You can reach out to us [in our Discord channel](https://discord.com/channels/1137072072808472616/).
## Basic LangChain-Fireworks example
## Installation
To use the `langchain-fireworks` package, follow these installation steps:
```bash
pip install langchain-fireworks
```
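As a quick, optional sanity check (a hypothetical helper, not part of the package), you can confirm the install resolved the `langchain_fireworks` module:

```python
import importlib.util

def is_installed(module_name: str) -> bool:
    """Return True if the module can be imported in this environment."""
    return importlib.util.find_spec(module_name) is not None

# The langchain-fireworks distribution installs the langchain_fireworks module.
print(is_installed("langchain_fireworks"))
```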
## Basic usage

### Setting up
1. Sign in to [Fireworks AI](http://fireworks.ai/) to obtain an API Key to access the models, and make sure it is set as the `FIREWORKS_API_KEY` environment variable.
Once you've signed in and obtained an API key, follow these steps to set the `FIREWORKS_API_KEY` environment variable:
- **Linux/macOS:** Open your terminal and execute the following command:
```bash
export FIREWORKS_API_KEY='your_api_key'
```
**Note:** To make this environment variable persistent across terminal sessions, add the above line to your `~/.bashrc`, `~/.bash_profile`, or `~/.zshrc` file.
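For example, with bash you could append the line in one step (a generic shell idiom, not a Fireworks-specific command; adjust the file for your shell):

```shell
# Append the export to your bash startup file; open a new terminal afterwards
echo "export FIREWORKS_API_KEY='your_api_key'" >> ~/.bashrc
```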
- **Windows:** For Command Prompt, use:
```cmd
set FIREWORKS_API_KEY=your_api_key
```
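Alternatively, from Python you can read the variable and prompt for it when missing; this small helper is illustrative (the function name is an assumption, not part of `langchain-fireworks`):

```python
import getpass
import os

def get_fireworks_api_key() -> str:
    """Return the Fireworks API key, prompting interactively if it is unset."""
    key = os.environ.get("FIREWORKS_API_KEY")
    if not key:
        key = getpass.getpass("Enter your Fireworks API key: ")
        os.environ["FIREWORKS_API_KEY"] = key  # cache for this process
    return key
```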
2. Set up your model using a model id. If the model is not set, the default model is `fireworks-llama-v2-7b-chat`. See the full, most up-to-date model list on [fireworks.ai](https://fireworks.ai/models).
```python
from langchain_fireworks import Fireworks

# The model client reads the FIREWORKS_API_KEY environment variable set above.
llm = Fireworks(
    model="accounts/fireworks/models/mixtral-8x7b-instruct",
    base_url="https://api.fireworks.ai/inference/v1/completions",
)
```
### Calling the Model Directly
You can call the model directly with string prompts to get completions.
```python
# Single prompt
output = llm.invoke("Who's the best quarterback in the NFL?")
print(output)
```
```python
# Calling multiple prompts
output = llm.generate(
    [
        "Who's the best cricket player in 2016?",
        "Who's the best basketball player in the league?",
    ]
)
print(output.generations)
```
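`generate` returns an `LLMResult`: its `generations` attribute holds one inner list per input prompt, each entry exposing the completion text. The sketch below mimics that shape with stand-in classes so it runs without an API key (these are simplified stubs, not LangChain's actual types):

```python
from dataclasses import dataclass
from typing import List

# Simplified stand-ins for LangChain's Generation / LLMResult objects;
# in real use, `output = llm.generate([...])` produces the same shape.
@dataclass
class Generation:
    text: str

@dataclass
class LLMResult:
    generations: List[List[Generation]]  # one inner list per input prompt

output = LLMResult(
    generations=[
        [Generation(text="completion for the cricket prompt")],
        [Generation(text="completion for the basketball prompt")],
    ]
)

# Pull the first candidate completion for each prompt
for gens in output.generations:
    print(gens[0].text)
```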
## Advanced usage

### Tool use: LangChain Agent + Fireworks function calling model

Please check out how to teach a Fireworks function-calling model to use a [calculator here](https://github.com/fw-ai/cookbook/blob/main/examples/function_calling/fireworks_langchain_tool_usage.ipynb).