How to optimize Spring Boot application performance in just a few steps

Published on 2024-05-08 by Javed Shaikh


Spring Boot applications are known for their ease of development and rapid deployment, but as your application scales, performance optimization becomes crucial. This guide covers proven strategies to enhance your Spring Boot application's performance, complete with practical code examples and detailed explanations.

1. Database Optimization

Connection Pooling Configuration

Database connections are expensive resources. Proper connection pooling significantly reduces connection overhead and improves response times.

yaml
# application.yml
spring:
  datasource:
    hikari:
      maximum-pool-size: 20
      minimum-idle: 5
      idle-timeout: 300000
      connection-timeout: 20000
      max-lifetime: 1200000
      leak-detection-threshold: 60000

Why it helps: HikariCP is the default connection pool in Spring Boot 2.x+. These settings ensure optimal connection reuse while preventing connection leaks and timeouts.
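
To confirm the pool is behaving as configured under load, you can read HikariCP's runtime statistics from the pool's MXBean. A minimal sketch, assuming the default HikariCP DataSource and that scheduling is enabled via @EnableScheduling (the PoolMonitor class name is illustrative):

java
import com.zaxxer.hikari.HikariDataSource;
import com.zaxxer.hikari.HikariPoolMXBean;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class PoolMonitor {

    private final HikariDataSource dataSource;

    public PoolMonitor(HikariDataSource dataSource) {
        this.dataSource = dataSource;
    }

    // Logs pool usage every 30 seconds using HikariCP's MXBean
    @Scheduled(fixedRate = 30_000)
    public void logPoolStats() {
        HikariPoolMXBean pool = dataSource.getHikariPoolMXBean();
        System.out.printf("active=%d idle=%d waiting=%d%n",
                pool.getActiveConnections(),
                pool.getIdleConnections(),
                pool.getThreadsAwaitingConnection());
    }
}

If threads are frequently waiting for a connection, the pool is too small for your workload; if most connections sit idle, it is larger than it needs to be.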

JPA Query Optimization

Lazy loading and query optimization prevent unnecessary database hits.

java
@Entity
public class User {
    @Id
    @GeneratedValue(strategy = GenerationType.IDENTITY)
    private Long id;

    private String name;

    // Use LAZY loading for collections
    @OneToMany(mappedBy = "user", fetch = FetchType.LAZY)
    private List<Order> orders;
}

@Repository
public interface UserRepository extends JpaRepository<User, Long> {

    // Use @Query for optimized queries
    @Query("SELECT u FROM User u WHERE u.name = :name")
    List<User> findByNameOptimized(@Param("name") String name);

    // Use JOIN FETCH to avoid the N+1 problem
    @Query("SELECT u FROM User u JOIN FETCH u.orders WHERE u.id = :id")
    Optional<User> findByIdWithOrders(@Param("id") Long id);
}

Performance impact: Lazy loading reduces initial query time by 30-70%, while JOIN FETCH eliminates N+1 queries, which otherwise issue one extra query per parent row and degrade performance sharply as result sets grow.
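
The difference shows up in the calling code. A rough sketch (the service and method names are illustrative, and it assumes the usual getters on User): iterating over lazily loaded collections issues one extra query per user, while the JOIN FETCH repository method loads a user together with its orders in a single query.

java
import java.util.List;
import org.springframework.stereotype.Service;
import org.springframework.transaction.annotation.Transactional;

@Service
public class OrderReportService {

    private final UserRepository userRepository;

    public OrderReportService(UserRepository userRepository) {
        this.userRepository = userRepository;
    }

    // N+1: one query for all users, then one query per user as each lazy collection is touched
    @Transactional(readOnly = true)
    public long totalOrdersNaive() {
        List<User> users = userRepository.findAll();
        return users.stream()
                .mapToLong(u -> u.getOrders().size())
                .sum();
    }

    // Single query: JOIN FETCH loads the user and its orders together
    @Transactional(readOnly = true)
    public int countOrdersForUser(Long userId) {
        return userRepository.findByIdWithOrders(userId)
                .map(u -> u.getOrders().size())
                .orElse(0);
    }
}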

2. Caching Strategies

Enable Spring Cache

Caching frequently accessed data dramatically reduces database load and improves response times.

java
@Configuration
@EnableCaching
public class CacheConfig {

    @Bean
    public CacheManager cacheManager() {
        CaffeineCacheManager cacheManager = new CaffeineCacheManager();
        cacheManager.setCaffeine(Caffeine.newBuilder()
                .maximumSize(1000)
                .expireAfterWrite(10, TimeUnit.MINUTES)
                .recordStats());
        return cacheManager;
    }
}

@Service
public class UserService {

    private final UserRepository userRepository;

    public UserService(UserRepository userRepository) {
        this.userRepository = userRepository;
    }

    @Cacheable(value = "users", key = "#id")
    public User getUserById(Long id) {
        // Result is cached; repeated calls skip the database
        return userRepository.findById(id).orElse(null);
    }

    @CacheEvict(value = "users", key = "#user.id")
    public User updateUser(User user) {
        // Evict the stale entry so the next read repopulates the cache
        return userRepository.save(user);
    }

    @Cacheable(value = "userStats", key = "#userId")
    public UserStats getUserStats(Long userId) {
        // Expensive calculation cached for 10 minutes (expireAfterWrite above)
        return calculateUserStats(userId);
    }
}

Performance gain: Properly implemented caching can reduce response times by 80-95% for frequently accessed data and reduce database load significantly.
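
Because the Caffeine builder above calls recordStats(), you can also check whether a cache is actually paying off before tuning its size. A minimal sketch, assuming the CaffeineCacheManager configuration shown earlier (the reporter class is illustrative):

java
import com.github.benmanes.caffeine.cache.stats.CacheStats;
import org.springframework.cache.CacheManager;
import org.springframework.cache.caffeine.CaffeineCache;
import org.springframework.stereotype.Component;

@Component
public class CacheStatsReporter {

    private final CacheManager cacheManager;

    public CacheStatsReporter(CacheManager cacheManager) {
        this.cacheManager = cacheManager;
    }

    // Reads the hit/miss counters Caffeine records for the "users" cache
    public void report() {
        CaffeineCache cache = (CaffeineCache) cacheManager.getCache("users");
        if (cache != null) {
            CacheStats stats = cache.getNativeCache().stats();
            System.out.printf("users cache: hitRate=%.2f evictions=%d%n",
                    stats.hitRate(), stats.evictionCount());
        }
    }
}

A hit rate well below 1.0 on a hot cache usually means entries expire too quickly or the maximum size is too small.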

Redis for Distributed Caching

For multi-instance applications, use Redis for shared caching.

yaml
# application.yml
spring:
  redis:
    host: localhost
    port: 6379
    timeout: 2000ms
    lettuce:
      pool:
        max-active: 8
        max-idle: 8
        min-idle: 0

java
@Configuration
public class RedisConfig {

    @Bean
    public RedisTemplate<String, Object> redisTemplate(RedisConnectionFactory connectionFactory) {
        RedisTemplate<String, Object> template = new RedisTemplate<>();
        template.setConnectionFactory(connectionFactory);
        template.setKeySerializer(new StringRedisSerializer());
        template.setValueSerializer(new GenericJackson2JsonRedisSerializer());
        return template;
    }
}
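
With the template in place, any instance of the application can read values written by another. A minimal usage sketch (the service, key, and TTL are illustrative, not part of the configuration above):

java
import java.time.Duration;
import org.springframework.data.redis.core.RedisTemplate;
import org.springframework.stereotype.Service;

@Service
public class SharedCacheService {

    private final RedisTemplate<String, Object> redisTemplate;

    public SharedCacheService(RedisTemplate<String, Object> redisTemplate) {
        this.redisTemplate = redisTemplate;
    }

    // Stores a value visible to every application instance, with a 10-minute TTL
    public void put(String key, Object value) {
        redisTemplate.opsForValue().set(key, value, Duration.ofMinutes(10));
    }

    public Object get(String key) {
        return redisTemplate.opsForValue().get(key);
    }
}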

3. Asynchronous Processing

Enable Async Methods

Move time-consuming operations to background threads to improve response times.

java
@Configuration
@EnableAsync
public class AsyncConfig implements AsyncConfigurer {

    @Override
    @Bean(name = "taskExecutor")
    public Executor getAsyncExecutor() {
        ThreadPoolTaskExecutor executor = new ThreadPoolTaskExecutor();
        executor.setCorePoolSize(5);
        executor.setMaxPoolSize(10);
        executor.setQueueCapacity(100);
        executor.setThreadNamePrefix("async-");
        executor.initialize();
        return executor;
    }
}

@Service
public class NotificationService {

    private static final Logger log = LoggerFactory.getLogger(NotificationService.class);

    @Async("taskExecutor")
    public CompletableFuture<Void> sendEmailNotification(String email, String message) {
        // Simulate email sending
        try {
            Thread.sleep(2000); // Email sending delay
            log.info("Email sent to: {}", email);
        } catch (InterruptedException e) {
            Thread.currentThread().interrupt();
        }
        return CompletableFuture.completedFuture(null);
    }

    @Async
    public void processLargeDataSet(List<Data> dataList) {
        // Process a large dataset asynchronously
        dataList.parallelStream()
                .forEach(this::processData);
    }
}

Impact: Async processing can improve perceived response times by 60-90% for operations involving external services or heavy computations.
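
From the caller's side the request thread returns immediately. A rough sketch of a controller that fires the notification and responds without waiting (the controller and endpoint names are illustrative):

java
import org.springframework.http.ResponseEntity;
import org.springframework.web.bind.annotation.PostMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class SignupController {

    private final NotificationService notificationService;

    public SignupController(NotificationService notificationService) {
        this.notificationService = notificationService;
    }

    @PostMapping("/signup")
    public ResponseEntity<String> signup(@RequestParam String email) {
        // Fire-and-forget: the email is sent on the "async-" thread pool,
        // so this response is not delayed by the two-second send
        notificationService.sendEmailNotification(email, "Welcome!");
        return ResponseEntity.accepted().body("Signup received");
    }
}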

4. JVM Optimization

Memory Configuration

Proper JVM tuning prevents garbage collection bottlenecks.

bash
# Production JVM settings (JVM flags must come before -jar)
java -Xms2g \
     -Xmx4g \
     -XX:+UseG1GC \
     -XX:MaxGCPauseMillis=200 \
     -XX:+HeapDumpOnOutOfMemoryError \
     -XX:HeapDumpPath=/logs/heapdump.hprof \
     -jar myapp.jar
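
To confirm the flags actually took effect, you can log the effective heap limits at startup. A minimal sketch (not part of the original setup):

java
import org.springframework.boot.CommandLineRunner;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class HeapInfoConfig {

    // Prints the heap limits the JVM is actually running with, so -Xms/-Xmx mistakes are caught early
    @Bean
    public CommandLineRunner logHeapSettings() {
        return args -> {
            Runtime rt = Runtime.getRuntime();
            System.out.printf("Max heap: %d MB, current heap: %d MB%n",
                    rt.maxMemory() / (1024 * 1024),
                    rt.totalMemory() / (1024 * 1024));
        };
    }
}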

Tomcat and Hibernate settings complement the JVM flags: tune the embedded Tomcat thread pool and enable Hibernate batching in application.yml.

yaml
# application.yml
server:
  # Optimize embedded Tomcat
  tomcat:
    threads:
      max: 200
      min-spare: 10
    connection-timeout: 20000
    max-connections: 8192
    accept-count: 100

spring:
  jpa:
    hibernate:
      # Disable in production
      ddl-auto: none
    # Optimize SQL logging
    show-sql: false
    properties:
      hibernate:
        # Enable batch processing
        jdbc.batch_size: 20
        order_inserts: true
        order_updates: true
        # Second-level cache
        cache.use_second_level_cache: true
        cache.region.factory_class: org.hibernate.cache.jcache.JCacheRegionFactory
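
Note that Hibernate's second-level cache only applies to entities that opt in. A minimal sketch, assuming a JCache provider such as Ehcache is on the classpath (the Country entity is illustrative; the annotations come from javax.persistence on Spring Boot 2.x):

java
import javax.persistence.Cacheable;
import javax.persistence.Entity;
import javax.persistence.Id;
import org.hibernate.annotations.Cache;
import org.hibernate.annotations.CacheConcurrencyStrategy;

@Entity
@Cacheable
// Marks this entity for the second-level cache enabled in application.yml above
@Cache(usage = CacheConcurrencyStrategy.READ_WRITE)
public class Country {

    @Id
    private Long id;

    private String name;
}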

5. HTTP Response Optimization

Enable Compression

Reduce payload size with GZIP compression.

yaml
# application.yml
server:
  compression:
    enabled: true
    mime-types: text/html,text/xml,text/plain,text/css,text/javascript,application/javascript,application/json
    min-response-size: 1024

java
@RestController
public class ApiController {

    private final UserService userService;
    private final DataService dataService;

    public ApiController(UserService userService, DataService dataService) {
        this.userService = userService;
        this.dataService = dataService;
    }

    @GetMapping("/users/{id}")
    public ResponseEntity<User> getUser(@PathVariable Long id) {
        User user = userService.getUserById(id);

        return ResponseEntity.ok()
                .cacheControl(CacheControl.maxAge(30, TimeUnit.MINUTES))
                .eTag(String.valueOf(user.getVersion()))
                .body(user);
    }

    @GetMapping("/static-data")
    public ResponseEntity<StaticData> getStaticData() {
        StaticData data = dataService.getStaticData();

        return ResponseEntity.ok()
                .cacheControl(CacheControl.maxAge(1, TimeUnit.HOURS)
                        .cachePublic())
                .body(data);
    }
}

Bandwidth savings: Compression typically reduces payload size by 60-80%, significantly improving load times for clients with slower connections.
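
Beyond compression, Spring can compute ETags automatically so that unchanged responses return 304 Not Modified instead of a full body. A minimal sketch (not from the original article) registering Spring's ShallowEtagHeaderFilter for selected endpoints:

java
import org.springframework.boot.web.servlet.FilterRegistrationBean;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.web.filter.ShallowEtagHeaderFilter;

@Configuration
public class EtagConfig {

    // Computes an ETag from the response body; clients sending If-None-Match get a 304 with no body
    @Bean
    public FilterRegistrationBean<ShallowEtagHeaderFilter> etagFilter() {
        FilterRegistrationBean<ShallowEtagHeaderFilter> registration =
                new FilterRegistrationBean<>(new ShallowEtagHeaderFilter());
        registration.addUrlPatterns("/static-data", "/users/*");
        return registration;
    }
}

This trades a little CPU (the body is still rendered and hashed) for bandwidth, so it suits responses that rarely change.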

6. Monitoring and Profiling

Enable Actuator for Monitoring

yaml
# application.yml
management:
  endpoints:
    web:
      exposure:
        include: health,metrics,prometheus
  endpoint:
    health:
      show-details: always
  metrics:
    export:
      prometheus:
        enabled: true

Custom Performance Metrics

java
@Component
public class PerformanceMetrics {

    private final MeterRegistry meterRegistry;

    public PerformanceMetrics(MeterRegistry meterRegistry) {
        this.meterRegistry = meterRegistry;
    }

    // Records the time between Timer.start() and sample.stop(), tagged by endpoint;
    // in practice, start the sample when the request begins and stop it on completion
    @EventListener
    public void handleRequest(RequestEvent event) {
        Timer.Sample sample = Timer.start(meterRegistry);
        sample.stop(Timer.builder("request.duration")
                .tag("endpoint", event.getEndpoint())
                .register(meterRegistry));
    }
}
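
Micrometer also supports an annotation-driven approach. A minimal sketch (not from the original article): registering a TimedAspect bean lets @Timed time any Spring-managed method, and the recorded timer appears under /actuator/metrics and on the Prometheus endpoint. It requires spring-boot-starter-aop; the ReportService below is illustrative.

java
import io.micrometer.core.annotation.Timed;
import io.micrometer.core.aop.TimedAspect;
import io.micrometer.core.instrument.MeterRegistry;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.stereotype.Service;

@Configuration
public class MetricsConfig {

    // Enables @Timed on arbitrary Spring beans
    @Bean
    public TimedAspect timedAspect(MeterRegistry registry) {
        return new TimedAspect(registry);
    }
}

@Service
public class ReportService {

    // Recorded as the "reports.generate" timer, with a latency histogram
    @Timed(value = "reports.generate", histogram = true)
    public byte[] generateReport() {
        // ... expensive report generation ...
        return new byte[0];
    }
}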

Best Practices

  1. Always measure before optimizing - Use profiling tools to identify actual bottlenecks
  2. Implement caching strategically - Cache expensive operations, not everything
  3. Optimize database queries - Use appropriate fetch strategies and avoid N+1 problems
  4. Configure connection pools properly - Match pool sizes to your application's needs
  5. Use async processing - For operations that don't need immediate responses
  6. Monitor continuously - Set up proper monitoring and alerting
  7. Test under load - Performance testing should simulate real-world conditions

Conclusion

Performance optimization is an iterative process that requires careful measurement and testing. Start by understanding your application's specific usage patterns and optimize accordingly. The techniques above provide a solid foundation, but the exact implementation will likely need further tuning to match your application's unique requirements.

About the Author

I am a Backend System Engineer at a credit card company, specializing in C/C++ and assembler on IBM's TPF OS. I have a passion for web development and enjoy working with Node.js and Python in my free time.


Related articles ...

Java Records vs Traditional Classes: When and how to use Java Records

Java Records significantly reduce boilerplate code. With records, we don’t need to write getters, toString(), equals(), or hashCode() methods.

2024-10-09

Spring Boot Actuator for Performance Monitoring

Spring Boot Actuator is a crucial tool for monitoring and managing Spring Boot applications in production. It provides ready-to-use endpoints that help track application health, metrics, and performance.

2024-02-05

Building a TCP Server with Reactor Netty and Spring Boot: A Step-by-Step Guide

When it comes to building high-performance, scalable TCP servers, the combination of Reactor Netty and Spring Boot offers a powerful solution. Reactor Netty provides a non-blocking and reactive approach to network programming, while Spring Boot simplifies the setup and configuration process

2024-01-02

Optimizing Memory Configuration in Java Applications

Understanding how to optimize memory configuration can dramatically improve your application's throughput, reduce latency, and prevent dreaded OutOfMemoryError exceptions

2024-07-17