The C Transformers library provides Python bindings for GGML models. This example goes over how to use LangChain to interact with C Transformers models.

Install
```python
%pip install -qU ctransformers
```
Load Model
```python
from langchain_community.llms import CTransformers

llm = CTransformers(model="marella/gpt-2-ggml")
```
Generate Text
```python
print(llm.invoke("AI is going to"))
```
Streaming
```python
from langchain_core.callbacks import StreamingStdOutCallbackHandler

llm = CTransformers(
    model="marella/gpt-2-ggml",
    callbacks=[StreamingStdOutCallbackHandler()],
)

response = llm.invoke("AI is going to")
```
LLMChain
```python
from langchain.chains import LLMChain
from langchain_core.prompts import PromptTemplate

template = """Question: {question}

Answer:"""

prompt = PromptTemplate.from_template(template)

llm_chain = LLMChain(prompt=prompt, llm=llm)

response = llm_chain.run("What is AI?")
```