spring-boot-ollama-sample
How to use the Ollama Java Client Spring Boot Starter

Steps:

  1. Create a Spring Boot 3 project
  2. Add the following dependency to the pom.xml file:
<dependency>
    <groupId>es.omarall</groupId>
    <artifactId>ollama-java-client-starter</artifactId>
    <version>1.0-SNAPSHOT</version>
</dependency>
  3. Add the required properties to the application.properties or application.yml file:
ollama:
  base-url: 'http://localhost:11434'
  call-timeout: 0
  connect-timeout: 0
  read-timeout: 0
  write-timeout: 0
  4. Now you can inject the OllamaService bean into any bean of your application:
    @Bean
    ApplicationRunner runner(OllamaService ollamaService,
                             SimpleStringStreamResponseProcessor streamResponseProcessor) {
        return args -> {

            // Embedding request
            EmbeddingResponse embeddingResponse = ollamaService.embed(EmbeddingRequest.builder()
                    .model(MODEL_NAME)
                    .prompt("Dare to embed this text?")
                    .build());
            log.info("******* Ollama Embedding response: {}", embeddingResponse.getEmbedding());

            log.info("******* (wait for it)");

            // Completion request
            Arrays.asList("What is the capital city of Spain?",
                          "Translate this text to Spanish: 'I love cookies!'")
                    .forEach(prompt -> {
                        CompletionResponse response = ollamaService.completion(CompletionRequest.builder()
                                .model(MODEL_NAME)
                                .prompt(prompt)
                                .build());
                        log.info("******* Ollama Completion response: {}", response.getResponse());
                    });

            // Streaming completion
            ollamaService.streamingCompletion(CompletionRequest.builder()
                    .model(MODEL_NAME)
                    .prompt("What is the meaning of life?")
                    .build(), streamResponseProcessor);
        };
    }
  5. Run it: install Ollama and the Mistral 7B model as stated here, then run the application:
mvn spring-boot:run
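Under the hood, a completion call like the ones above is sent to the local Ollama HTTP API. As an illustration only, here is a minimal sketch of the JSON payload such a request resolves to, assuming Ollama's documented POST /api/generate wire format — the class and method names below are hypothetical, and the starter's request builders normally shield you from this level of detail:

```java
// Hypothetical helper: sketches the JSON body that a CompletionRequest
// conceptually maps to on the wire (an assumption about the format;
// not part of the starter's API).
public class GeneratePayloadSketch {

    // Minimal hand-rolled JSON for illustration; a real client would use
    // a JSON library and escape the prompt properly.
    static String payload(String model, String prompt, boolean stream) {
        return String.format(
                "{\"model\":\"%s\",\"prompt\":\"%s\",\"stream\":%b}",
                model, prompt, stream);
    }

    public static void main(String[] args) {
        System.out.println(
                payload("mistral", "What is the capital city of Spain?", false));
    }
}
```

With Ollama running locally, a body of this shape POSTed to http://localhost:11434/api/generate reproduces what a non-streaming completion request does, which can be handy when debugging timeouts or model availability outside the Spring context.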

Ollama exchanges