This guide provides step-by-step instructions for bridging your local Ollama instance with Crowdin.
Since Crowdin is a cloud-based platform, it cannot reach your localhost directly. To bridge this gap, you can use ngrok to create a secure tunnel.
First, ensure Ollama is running and configured correctly on your local machine.
1. Start Ollama. Ensure the application is active in your system tray or terminal.
2. Download a Model. Pull a model that performs well on translation tasks. For example:
ollama pull llama3
3. Verify local access. Open http://localhost:11434 in your browser. You should see the message: "Ollama is running".
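The browser check only proves the server is up. To confirm the API itself responds with the model you pulled, you can send a test request. A quick sanity check, assuming the llama3 model from step 2 (the /api/generate endpoint is Ollama's native generation API):

curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Translate \"Hello, world\" into French.",
  "stream": false
}'

A JSON reply containing a response field means Ollama is serving requests locally.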
This step creates a temporary public URL that Crowdin can use to "talk" to your computer.
1. Install ngrok. Download it from ngrok.com.
2. Start the tunnel. Run the following command in your terminal:
ngrok http 11434
3. Copy the Forwarding URL. Look for the line starting with Forwarding. It will look like https://a1b2-c3d4.ngrok-free.app.
Note: Keep this terminal window open. If you close it, the connection will break.
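Before wiring up Crowdin, it is worth confirming that the tunnel actually reaches Ollama. A minimal check from the public side, using the example forwarding URL above (substitute your own):

curl https://a1b2-c3d4.ngrok-free.app/api/tags

If this returns your installed models as JSON, the tunnel works. If Ollama answers with a 403 instead, it is likely rejecting the unfamiliar Host header; restarting the tunnel with ngrok's host-header rewrite usually resolves this:

ngrok http 11434 --host-header="localhost:11434"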
Now, we will use Crowdin's native OpenAI connector and point it to your local machine.
1. Open Crowdin. Navigate to your project and go to Settings > AI.
2. Select Provider. Choose the OpenAI (Native) connector.
3. Configure Connection. Set the connector's API URL (or base URL, depending on how Crowdin labels the field) to your ngrok forwarding URL followed by /v1, since Ollama exposes an OpenAI-compatible API under that path. Ollama does not validate the API key, so any placeholder value works.
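Crowdin's requests follow the OpenAI chat-completions format, which Ollama serves under /v1, so you can simulate one yourself to test the whole chain. In the sketch below, the URL is the example forwarding address from earlier and the Bearer token is an arbitrary placeholder, since Ollama ignores it:

curl https://a1b2-c3d4.ngrok-free.app/v1/chat/completions \
  -H "Content-Type: application/json" \
  -H "Authorization: Bearer placeholder" \
  -d '{
    "model": "llama3",
    "messages": [{"role": "user", "content": "Translate \"Good morning\" into Spanish."}]
  }'

If this returns a completion, Crowdin can reach your model the same way.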