mirror of
https://github.com/hwchase17/langchain.git
synced 2025-05-25 08:57:48 +00:00
# Add C Transformers for GGML Models

I created Python bindings for the GGML models: https://github.com/marella/ctransformers

Currently it supports GPT-2, GPT-J, GPT-NeoX, LLaMA, MPT, etc. See [Supported Models](https://github.com/marella/ctransformers#supported-models).

It provides a unified interface for all models:

```python
from langchain.llms import CTransformers

llm = CTransformers(model='/path/to/ggml-gpt-2.bin', model_type='gpt2')

print(llm('AI is going to'))
```

It can be used with models hosted on the Hugging Face Hub:

```python
llm = CTransformers(model='marella/gpt-2-ggml')
```

It supports streaming:

```python
from langchain.callbacks.streaming_stdout import StreamingStdOutCallbackHandler

llm = CTransformers(model='marella/gpt-2-ggml', callbacks=[StreamingStdOutCallbackHandler()])
```

Please see the [README](https://github.com/marella/ctransformers#readme) for more details.

---------

Co-authored-by: Dev 2049 <dev.dev2049@gmail.com>
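The underlying ctransformers library also accepts generation settings. As a sketch, these could be forwarded through a `config` dict on the wrapper; the specific keys below (`max_new_tokens`, `temperature`) are assumptions taken from the ctransformers README, not verified against this PR:

```python
from langchain.llms import CTransformers

# Assumed generation settings forwarded to ctransformers
# (key names taken from the ctransformers README, not this PR).
config = {'max_new_tokens': 256, 'temperature': 0.8}

llm = CTransformers(model='marella/gpt-2-ggml', config=config)
```

This is a configuration fragment rather than a runnable test, since constructing the model downloads GGML weights from the Hugging Face Hub.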
This directory contains the following notebooks:

- ai21.ipynb
- aleph_alpha.ipynb
- anyscale.ipynb
- azure_openai_example.ipynb
- banana.ipynb
- beam.ipynb
- cerebriumai_example.ipynb
- cohere.ipynb
- ctransformers.ipynb
- deepinfra_example.ipynb
- forefrontai_example.ipynb
- google_vertex_ai_palm.ipynb
- gooseai_example.ipynb
- gpt4all.ipynb
- huggingface_hub.ipynb
- huggingface_pipelines.ipynb
- huggingface_textgen_inference.ipynb
- jsonformer_experimental.ipynb
- llamacpp.ipynb
- manifest.ipynb
- modal.ipynb
- mosaicml.ipynb
- nlpcloud.ipynb
- openai.ipynb
- openlm.ipynb
- petals_example.ipynb
- pipelineai_example.ipynb
- predictionguard.ipynb
- promptlayer_openai.ipynb
- rellm_experimental.ipynb
- replicate.ipynb
- runhouse.ipynb
- sagemaker.ipynb
- stochasticai.ipynb
- writer.ipynb