Using LangChain4j With Micronaut

1. Overview

LangChain4j is a Java library inspired by LangChain. We use it in our Java applications to integrate with large language models (LLMs).

Micronaut is a modern, JVM-based framework designed for building lightweight, modular, and fast applications. We use it to create microservices, serverless applications, and cloud-native solutions with minimal startup time and memory usage. Its powerful dependency injection and AOT (Ahead-of-Time) compilation ensure high performance and scalability. Micronaut also offers excellent integration with LangChain4j, allowing us to leverage the benefits of both frameworks in a single application.

In this tutorial, we’ll learn how to build AI-powered applications using LangChain4j and Micronaut. We’ll explore how straightforward it is to create powerful tools to automate our tasks.

2. Relocation Advisor Application

We’ll build a LangChain4j-integrated Micronaut application. In this application, we’ll create a simple chatbot to provide advice about potential relocation countries. The chatbot will rely on a few provided links for its information.

The application only responds to questions about the countries it’s aware of.

2.1. Dependencies

Let’s start by adding the dependencies. First, we add the langchain4j-open-ai dependency:

<dependency>
    <groupId>dev.langchain4j</groupId>
    <artifactId>langchain4j-open-ai</artifactId>
    <version>0.36.2</version>
</dependency>

Then, let’s add the Micronaut-related dependencies:

<dependency>  
    <groupId>io.micronaut.langchain4j</groupId>  
    <artifactId>micronaut-langchain4j-core</artifactId>  
    <version>0.0.1</version>  
</dependency>  
<dependency>  
    <groupId>io.micronaut.langchain4j</groupId>  
    <artifactId>micronaut-langchain4j-openai</artifactId>  
    <version>0.0.1</version>  
</dependency>

2.2. Configuration

Since we’ll use OpenAI’s LLM, let’s add the OpenAI API key to our application YAML file:

langchain4j:  
  open-ai:  
    api-key: ${OPENAI_API_KEY}
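Micronaut resolves the ${OPENAI_API_KEY} placeholder from an environment variable of the same name, so we export it before starting the application (the key value below is just a placeholder, not a real key):

```shell
# Export the OpenAI API key so Micronaut can resolve ${OPENAI_API_KEY}
export OPENAI_API_KEY="sk-your-key-here"
```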

2.3. Relocation Advisor

Now, let’s create the chatbot interface:

public interface RelocationAdvisor {

    @SystemMessage("""
        You are a relocation advisor. Answer using an official tone.
        Provide the numbers you get from the resources.
        To the best of your knowledge, answer the question below regarding possible relocation.
        Please get information only from the following sources:
        - https://www.numbeo.com/cost-of-living/country_result.jsp?country=Spain
        - https://www.numbeo.com/cost-of-living/country_result.jsp?country=Romania
        And their subpages. Then, answer. If you don't have the information, answer with the exact text
        'I don't have information about {Country}'
        """)
    String chat(@UserMessage String question);
}

In the RelocationAdvisor interface, we define the chat() method, which processes user messages and returns the model's response as a string. We use the @SystemMessage annotation to specify the chatbot's basic behavior. In this case, we want the chatbot to act as a relocation advisor and rely solely on two specific links for information about the countries.

2.4. Relocation Advisor Factory

Now, let’s create the RelocationAdvisorFactory:

@Factory  
public class RelocationAdvisorFactory {  
  
    @Value("${langchain4j.open-ai.api-key}")  
    private String apiKey;  
  
    @Singleton  
    public RelocationAdvisor advisor() {  
        ChatLanguageModel model = OpenAiChatModel.builder()  
          .apiKey(apiKey)  
          .modelName(OpenAiChatModelName.GPT_4_O_MINI)  
          .build();  
  
       return AiServices.create(RelocationAdvisor.class, model);  
    }  
}

Using this factory, we produce the RelocationAdvisor as a bean. We use the OpenAiChatModel builder to create a model instance and then leverage AiServices to create an implementation of our chatbot interface backed by that model.

2.5. Test for Case With Existing Sources

Let’s test our advisor and see what information it provides about the countries it’s aware of:

@MicronautTest(rebuildContext = true)  
public class RelocationAdvisorLiveTest {  
  
    Logger logger = LoggerFactory.getLogger(RelocationAdvisorLiveTest.class);  
  
    @Inject  
    RelocationAdvisor assistant;  
  
    @Test  
    void givenAdvisor_whenSendChatMessage_thenExpectedResponseShouldContainInformationAboutRomania() {  
        String response = assistant.chat("Tell me about Romania");  
        logger.info(response);  
        Assertions.assertTrue(response.contains("Romania"));  
    }  
  
    @Test  
    void givenAdvisor_whenSendChatMessage_thenExpectedResponseShouldContainInformationAboutSpain() {  
        String response = assistant.chat("Tell me about Spain");  
        logger.info(response);  
        Assertions.assertTrue(response.contains("Spain"));  
    }  
}

We injected an instance of our RelocationAdvisor into the test and asked it about a few countries. As expected, each response included the country's name.

Additionally, we logged the model’s response. Here’s what it looks like:

15:43:47.334 [main] INFO  c.b.m.l.RelocationAdvisorLiveTest - Spain has a cost of living index of 58.54, which is relatively moderate compared to other countries.   
The average rent for a single-bedroom apartment in the city center is approximately €906.52, while outside the city center, it is around €662.68...

2.6. Test Case for Missing Information in Sources

Now, let’s ask our advisor about a country it doesn’t have information on:

@Test  
void givenAdvisor_whenSendChatMessage_thenExpectedResponseShouldNotContainInformationAboutNorway() {  
    String response = assistant.chat("Tell me about Norway");  
    logger.info(response);  
    Assertions.assertTrue(response.contains("I don't have information about Norway"));  
}

In this case, our chatbot responded with a predefined message stating that it’s not aware of this country.
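Since the assertions key off the exact fallback sentence defined in the @SystemMessage, it can be handy to centralize that check instead of repeating the string literal. Here's a minimal sketch (the FallbackDetector helper is hypothetical and not part of the article's code):

```java
public class FallbackDetector {

    // The exact prefix of the predefined "no information" reply from the system message
    private static final String FALLBACK_PREFIX = "I don't have information about";

    // Returns true when the model answered with the predefined fallback message
    static boolean isFallback(String response) {
        return response != null && response.trim().startsWith(FALLBACK_PREFIX);
    }

    public static void main(String[] args) {
        System.out.println(isFallback("I don't have information about Norway"));      // true
        System.out.println(isFallback("Spain has a cost of living index of 58.54")); // false
    }
}
```

Keeping the fallback text in one place means that if we ever reword the system message, only one constant needs to change.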

3. Conclusion

In this article, we’ve reviewed how to integrate LangChain4j with Micronaut. As we discovered, it’s straightforward to achieve LLM-powered functionality and integrate it into a Micronaut application. Additionally, we have good control over our AI services, allowing us to enhance them with additional behaviors and create more complex solutions.

As always, the code is available over on GitHub.

The post Using LangChain4j With Micronaut first appeared on Baeldung.