Why does connecting with Lettuce take so much longer than with Jedis?

Connecting to a local Redis, Lettuce takes nearly 5000 ms, while Jedis needs only about 30 ms. I am referring to the ConnectToRedis example.

I am using the default spring-boot-starter with the Lombok dependency.

My code:

import io.lettuce.core.RedisClient;
import io.lettuce.core.api.StatefulRedisConnection;
import lombok.extern.slf4j.Slf4j;
import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;
import org.springframework.util.StopWatch;

@Component
@Slf4j
class LettuceRunner implements CommandLineRunner {
    @Override
    public void run(String... args) throws Exception {

        StopWatch watch = new StopWatch();
        RedisClient redisClient = RedisClient.create("redis://localhost:6379");

        // Time only the connect() call itself.
        watch.start();
        StatefulRedisConnection<String, String> connection = redisClient.connect();
        watch.stop();

        log.info("lettuce : {} ms", watch.getLastTaskTimeMillis());

        connection.close();
        redisClient.shutdown();
    }
}

import lombok.extern.slf4j.Slf4j;
import org.springframework.boot.CommandLineRunner;
import org.springframework.stereotype.Component;
import org.springframework.util.StopWatch;
import redis.clients.jedis.Jedis;

@Component
@Slf4j
class JedisRunner implements CommandLineRunner {
    @Override
    public void run(String... args) throws Exception {

        StopWatch watch = new StopWatch();
        watch.start();
        // Jedis opens its socket lazily, so the GET below is what
        // actually triggers the connection.
        Jedis jedis = new Jedis("localhost");
        jedis.get("redis_key");
        watch.stop();
        log.info("jedis : {} ms", watch.getLastTaskInfo().getTimeMillis());
    }
}

The result is:

2020-08-14 17:02:28.236  INFO 21760 --- [           main] com.example.demo.JedisRunner             : jedis : 27 ms
2020-08-14 17:02:33.318  INFO 21760 --- [           main] com.example.demo.LettuceRunner           : lettuce : 4815 ms

Solution

Because Lettuce is built on Netty, and most of that time is spent bootstrapping Netty itself (event loops, channel pipeline, buffer pools) rather than opening the Redis connection. This is largely a one-time initialization cost, not a per-connection one.

Check the logs: as you can see, most of the time is spent inside the io.netty packages (see the sketch after the log):

2020-08-15 00:54:06.030 DEBUG 728 --- [           main] i.l.c.r.DefaultEventLoopGroupProvider    : Creating executor io.netty.util.concurrent.DefaultEventExecutorGroup
2020-08-15 00:54:06.031 DEBUG 728 --- [           main] io.lettuce.core.RedisClient              : Trying to get a Redis connection for: RedisURI [host='localhost',port=6379]
2020-08-15 00:54:06.120 DEBUG 728 --- [           main] io.lettuce.core.EpollProvider            : Starting without optional epoll library
2020-08-15 00:54:06.122 DEBUG 728 --- [           main] io.lettuce.core.KqueueProvider           : Starting without optional kqueue library
2020-08-15 00:54:06.123 DEBUG 728 --- [           main] i.l.c.r.DefaultEventLoopGroupProvider    : Allocating executor io.netty.channel.nio.NioEventLoopGroup
2020-08-15 00:54:06.123 DEBUG 728 --- [           main] i.l.c.r.DefaultEventLoopGroupProvider    : Creating executor io.netty.channel.nio.NioEventLoopGroup
2020-08-15 00:54:06.124 DEBUG 728 --- [           main] i.n.channel.MultithreadEventLoopGroup    : -Dio.netty.eventLoopThreads: 12
2020-08-15 00:54:06.129 DEBUG 728 --- [           main] io.netty.channel.nio.NioEventLoop        : -Dio.netty.noKeySetOptimization: false
2020-08-15 00:54:06.129 DEBUG 728 --- [           main] io.netty.channel.nio.NioEventLoop        : -Dio.netty.selectorAutoRebuildThreshold: 512
2020-08-15 00:54:06.421 DEBUG 728 --- [           main] i.l.c.r.DefaultEventLoopGroupProvider    : Adding reference to io.netty.channel.nio.NioEventLoopGroup@7c59cf66,existing ref count 0
2020-08-15 00:54:06.431 DEBUG 728 --- [           main] io.lettuce.core.RedisClient              : Resolved SocketAddress localhost:6379 using RedisURI [host='localhost',port=6379]
2020-08-15 00:54:06.432 DEBUG 728 --- [           main] io.lettuce.core.RedisClient              : Connecting to Redis at localhost:6379
2020-08-15 00:54:06.435 DEBUG 728 --- [           main] io.netty.channel.DefaultChannelId        : -Dio.netty.processId: 728 (auto-detected)
2020-08-15 00:54:06.437 DEBUG 728 --- [           main] io.netty.util.NetUtil                    : -Djava.net.preferIPv4Stack: false
2020-08-15 00:54:06.437 DEBUG 728 --- [           main] io.netty.util.NetUtil                    : -Djava.net.preferIPv6Addresses: false
2020-08-15 00:54:06.659 DEBUG 728 --- [           main] io.netty.util.NetUtil                    : Loopback interface: lo (Software Loopback Interface 1,127.0.0.1)
2020-08-15 00:54:06.660 DEBUG 728 --- [           main] io.netty.util.NetUtil                    : Failed to get SOMAXCONN from sysctl and file \proc\sys\net\core\somaxconn. Default: 200
2020-08-15 00:54:06.898 DEBUG 728 --- [           main] io.netty.channel.DefaultChannelId        : -Dio.netty.machineId: 00:50:56:ff:fe:c0:00:08 (auto-detected)
2020-08-15 00:54:06.911 DEBUG 728 --- [           main] io.netty.buffer.ByteBufUtil              : -Dio.netty.allocator.type: pooled
2020-08-15 00:54:06.912 DEBUG 728 --- [           main] io.netty.buffer.ByteBufUtil              : -Dio.netty.threadLocalDirectBufferSize: 0
2020-08-15 00:54:06.912 DEBUG 728 --- [           main] io.netty.buffer.ByteBufUtil              : -Dio.netty.maxThreadLocalCharBufferSize: 16384
2020-08-15 00:54:06.928 DEBUG 728 --- [ioEventLoop-8-1] io.netty.util.Recycler                   : -Dio.netty.recycler.maxCapacityPerThread: 4096
2020-08-15 00:54:06.928 DEBUG 728 --- [ioEventLoop-8-1] io.netty.util.Recycler                   : -Dio.netty.recycler.maxSharedCapacityFactor: 2
2020-08-15 00:54:06.928 DEBUG 728 --- [ioEventLoop-8-1] io.netty.util.Recycler                   : -Dio.netty.recycler.linkCapacity: 16
2020-08-15 00:54:06.928 DEBUG 728 --- [ioEventLoop-8-1] io.netty.util.Recycler                   : -Dio.netty.recycler.ratio: 8
2020-08-15 00:54:06.928 DEBUG 728 --- [ioEventLoop-8-1] io.netty.util.Recycler                   : -Dio.netty.recycler.delayedQueue.ratio: 8
2020-08-15 00:54:06.933 DEBUG 728 --- [ioEventLoop-8-1] io.netty.buffer.AbstractByteBuf          : -Dio.netty.buffer.checkAccessible: true
2020-08-15 00:54:06.933 DEBUG 728 --- [ioEventLoop-8-1] io.netty.buffer.AbstractByteBuf          : -Dio.netty.buffer.checkBounds: true
2020-08-15 00:54:06.933 DEBUG 728 --- [ioEventLoop-8-1] i.n.util.ResourceLeakDetectorFactory     : Loaded default ResourceLeakDetector: io.netty.util.ResourceLeakDetector@20e9fc6c
2020-08-15 00:54:06.950 DEBUG 728 --- [ioEventLoop-8-1] io.lettuce.core.protocol.CommandHandler  : [channel=0x1ced470d,[id: 0x7bd077d9] (inactive),chid=0x1] channelRegistered()
2020-08-15 00:54:06.953 DEBUG 728 --- [ioEventLoop-8-1] io.lettuce.core.protocol.CommandHandler  : [channel=0x1ced470d,/127.0.0.1:2106 -> localhost/127.0.0.1:6379,chid=0x1] channelActive()
2020-08-15 00:54:06.954 DEBUG 728 --- [ioEventLoop-8-1] i.lettuce.core.protocol.DefaultEndpoint  : [channel=0x1ced470d,epid=0x1] activateEndpointAndExecuteBufferedCommands 0 command(s) buffered
2020-08-15 00:54:06.954 DEBUG 728 --- [ioEventLoop-8-1] i.lettuce.core.protocol.DefaultEndpoint  : [channel=0x1ced470d,epid=0x1] activating endpoint
2020-08-15 00:54:06.954 DEBUG 728 --- [ioEventLoop-8-1] i.lettuce.core.protocol.DefaultEndpoint  : [channel=0x1ced470d,epid=0x1] flushCommands()
2020-08-15 00:54:06.954 DEBUG 728 --- [ioEventLoop-8-1] i.lettuce.core.protocol.DefaultEndpoint  : [channel=0x1ced470d,epid=0x1] flushCommands() Flushing 0 commands
2020-08-15 00:54:06.954 DEBUG 728 --- [ioEventLoop-8-1] i.l.core.protocol.ConnectionWatchdog     : [channel=0x1ced470d,last known addr=localhost/127.0.0.1:6379] channelActive()
2020-08-15 00:54:06.954 DEBUG 728 --- [ioEventLoop-8-1] io.lettuce.core.protocol.CommandHandler  : [channel=0x1ced470d,chid=0x1] channelActive() done
2020-08-15 00:54:06.955 DEBUG 728 --- [ioEventLoop-8-1] io.lettuce.core.RedisClient              : Connecting to Redis at localhost:6379: Success
2020-08-15 00:54:06.956  INFO 728 --- [           main] c.h.s.c.c.CacheStudyApplicationTests     : lettuce : 925 ms
2020-08-15 00:54:06.956 DEBUG 728 --- [           main] io.lettuce.core.RedisChannelHandler      : close()
2020-08-15 00:54:06.956 DEBUG 728 --- [           main] io.lettuce.core.RedisChannelHandler      : closeAsync()
2020-08-15 00:54:06.956 DEBUG 728 --- [           main] i.lettuce.core.protocol.DefaultEndpoint  : [channel=0x1ced470d,epid=0x1] closeAsync()
2020-08-15 00:54:06.957 DEBUG 728 --- [ioEventLoop-8-1] i.l.core.protocol.ConnectionWatchdog     : [channel=0x1ced470d,last known addr=localhost/127.0.0.1:6379] userEventTriggered(ctx,io.lettuce.core.ConnectionEvents$Activated@1cda757f)
2020-08-15 00:54:06.958 DEBUG 728 --- [           main] io.lettuce.core.RedisClient              : Initiate shutdown (0,2,SECONDS)
2020-08-15 00:54:06.959 DEBUG 728 --- [ioEventLoop-8-1] io.lettuce.core.protocol.CommandHandler  : [channel=0x1ced470d,chid=0x1] channelInactive()
2020-08-15 00:54:06.959 DEBUG 728 --- [ioEventLoop-8-1] i.lettuce.core.protocol.DefaultEndpoint  : [channel=0x1ced470d,epid=0x1] deactivating endpoint handler
2020-08-15 00:54:06.960 DEBUG 728 --- [ioEventLoop-8-1] io.lettuce.core.protocol.CommandHandler  : [channel=0x1ced470d,chid=0x1] channelInactive() done
2020-08-15 00:54:06.960 DEBUG 728 --- [ioEventLoop-8-1] i.l.core.protocol.ConnectionWatchdog     : [channel=0x1ced470d,last known addr=localhost/127.0.0.1:6379] channelInactive()
2020-08-15 00:54:06.960 DEBUG 728 --- [ioEventLoop-8-1] i.l.core.protocol.ConnectionWatchdog     : [channel=0x1ced470d,last known addr=localhost/127.0.0.1:6379] Reconnect scheduling disabled
2020-08-15 00:54:06.960 DEBUG 728 --- [ioEventLoop-8-1] io.lettuce.core.protocol.CommandHandler  : [channel=0x1ced470d,chid=0x1] channelUnregistered()
2020-08-15 00:54:06.961 DEBUG 728 --- [           main] i.l.c.resource.DefaultClientResources    : Initiate shutdown (0,SECONDS)
2020-08-15 00:54:06.963 DEBUG 728 --- [           main] i.l.c.r.DefaultEventLoopGroupProvider    : Initiate shutdown (0,SECONDS)
2020-08-15 00:54:06.963 DEBUG 728 --- [           main] i.l.c.r.DefaultEventLoopGroupProvider    : Release executor io.netty.channel.nio.NioEventLoopGroup@7c59cf66
2020-08-15 00:54:06.965 DEBUG 728 --- [ioEventLoop-8-1] io.netty.buffer.PoolThreadCache          : Freed 1 thread-local buffer(s) from thread: lettuce-nioEventLoop-8-1
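
Since the expensive part is Netty's one-time bootstrap rather than the TCP connect itself, the practical takeaway is to create the client once and reuse its connections, instead of paying for a fresh client per operation. The following is a minimal sketch of that idea, not code from the original post; the class name LettuceReuseSketch and the use of a shared ClientResources instance are illustrative assumptions:

import io.lettuce.core.RedisClient;
import io.lettuce.core.api.StatefulRedisConnection;
import io.lettuce.core.resource.ClientResources;
import io.lettuce.core.resource.DefaultClientResources;
import org.springframework.util.StopWatch;

public class LettuceReuseSketch {

    public static void main(String[] args) {
        StopWatch watch = new StopWatch();

        // Share one set of Netty resources (event loops, timers) across clients.
        ClientResources sharedResources = DefaultClientResources.create();
        RedisClient client = RedisClient.create(sharedResources, "redis://localhost:6379");

        watch.start("first connect");   // pays the one-time Netty bootstrap
        StatefulRedisConnection<String, String> first = client.connect();
        watch.stop();

        watch.start("second connect");  // reuses the already-started event loops
        StatefulRedisConnection<String, String> second = client.connect();
        watch.stop();

        System.out.println(watch.prettyPrint());

        // A StatefulRedisConnection is thread-safe; a real application would
        // keep one open and share it instead of connecting per request.
        first.close();
        second.close();
        client.shutdown();
        sharedResources.shutdown();
    }
}

In a Spring Boot application the same idea applies: let spring-boot-starter-data-redis build and manage a single connection factory at startup, so the bootstrap cost is paid only once rather than on every operation.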
