
Why doesn't SparkJava process the second request on the same connection?


I wrote a small server with a REST API using SparkJava and tried to query it with the Apache HttpClient. With this client I open one connection, send a first request to the server, and receive the response. Then I reuse the same connection to send a second request. The request is transmitted, but the server never processes it. Does anyone know what I am doing wrong?

Here is a minimal working example:

Maven dependencies:

        <dependency>
            <groupId>com.sparkjava</groupId>
            <artifactId>spark-core</artifactId>
            <version>2.9.3</version>
        </dependency>
        <dependency>
            <groupId>org.apache.httpcomponents.client5</groupId>
            <artifactId>httpclient5</artifactId>
            <version>5.0.3</version>
        </dependency>

The server class:

package minimal;

import spark.Spark;

public class Server {

  public static void main(String[] args) {
    Spark.post("/a", (req, resp) -> {
      resp.status(204);
      return "";
    });
    Spark.post("/b", (req, resp) -> {
      resp.status(204);
      return "";
    });
    Spark.before((req, res) -> {
      System.out.println("Before: Request from " + req.ip() + " received " + req.pathInfo());
    });
    Spark.after((req, res) -> {
      System.out.println("After: Request from " + req.ip() + " received " + req.pathInfo());
    });
  }
}

The client class:

package minimal;

import java.io.IOException;

import org.apache.hc.client5.http.classic.methods.HttpPost;
import org.apache.hc.client5.http.impl.classic.CloseableHttpClient;
import org.apache.hc.client5.http.impl.classic.CloseableHttpResponse;
import org.apache.hc.client5.http.impl.classic.HttpClients;

public class Client {

  public static void main(String[] args) throws IOException {
    try (CloseableHttpClient httpclient = HttpClients.createDefault()) {
      HttpPost httpPost1 = new HttpPost("http://localhost:4567/a");
      try (CloseableHttpResponse response1 = httpclient.execute(httpPost1)) {
        System.out.println(response1.getCode() + " " + response1.getReasonPhrase());
      }

      HttpPost httpPost2 = new HttpPost("http://localhost:4567/b");
      try (CloseableHttpResponse response2 = httpclient.execute(httpPost2)) {
        System.out.println(response2.getCode() + " " + response2.getReasonPhrase());
      }
    }
  }
}
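As a sanity check independent of HttpClient, the same two-requests-over-one-connection scenario can also be reproduced from the command line. This is a sketch assuming the server above is running on localhost:4567; when curl is given several URLs in one invocation, it keeps the connection alive and sends the later requests over the same TCP connection by default:

```shell
# Two POSTs in a single curl invocation: curl reuses the first
# connection for the second transfer, mimicking the HttpClient test.
# The -v output should indicate that the existing connection is re-used.
curl -v -X POST http://localhost:4567/a http://localhost:4567/b
```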

Server output on the console:

Before: Request from 127.0.0.1 received /a
After: Request from 127.0.0.1 received /a

And here is the abbreviated tcpdump output:

14:52:15.210468 IP localhost.44020 > localhost.4567:
POST /a HTTP/1.1
Accept-Encoding: gzip,x-gzip,deflate
Host: localhost:4567
Connection: keep-alive
User-Agent: Apache-HttpClient/5.0.3 (Java/1.8.0_282)

14:52:15.271563 IP localhost.4567 > localhost.44020:
HTTP/1.1 204 No Content
Date: Tue,27 Apr 2021 12:52:15 GMT
Content-Type: text/html;charset=utf-8
Server: Jetty(9.4.26.v20200117)

14:52:15.277376 IP localhost.44020 > localhost.4567:
POST /b HTTP/1.1
Accept-Encoding: gzip,deflate
Host: localhost:4567
Connection: keep-alive
User-Agent: Apache-HttpClient/5.0.3 (Java/1.8.0_282)

After this point, no further response from the server is recorded.

Solution

Here is a client example; please try it and see if it works for you. I tested it and it works fine. Note that, judging by its imports, it uses the older HttpClient 4.x API (`org.apache.http`) rather than HttpClient 5.

import org.apache.http.client.methods.CloseableHttpResponse;
import org.apache.http.client.methods.HttpPost;
import org.apache.http.impl.client.CloseableHttpClient;
import org.apache.http.impl.client.HttpClients;

import java.io.IOException;

public class T1 {

    static void runPost(CloseableHttpClient c, String s) {
        HttpPost httpPost1 = new HttpPost(s);
        try (CloseableHttpResponse response1 = c.execute(httpPost1)) {
            System.out.println(Thread.currentThread().getName() + ": " +
                    response1.getStatusLine().getStatusCode() + " " +
                    response1.getStatusLine().getReasonPhrase());
        } catch (Exception e) {
            e.printStackTrace();
        }
    }

    public static void main(String[] args) throws IOException {
        try (CloseableHttpClient httpclient = HttpClients.createDefault()) {
            T1.runPost(httpclient, "http://localhost:4567/a");
            T1.runPost(httpclient, "http://localhost:4567/b");
        }
        System.exit(0);
    }
}

The reason the SparkJava server failed to process the second request was that I had the following extra Maven dependency in my project:

        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.4.0.7.1.1.0-565</version>
        </dependency>

After removing this dependency, the SparkJava server works as expected.
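The Apache Spark `spark-core_2.11` artifact pulls its own transitive dependencies (including Jetty-related classes) onto the classpath, which can conflict with the Jetty version that SparkJava's embedded server expects. One way to spot such clashes is Maven's dependency tree; this is a sketch of the diagnostic, assuming the standard `maven-dependency-plugin` is available:

```shell
# Print the resolved dependency tree, filtered to Jetty artifacts,
# to see which Jetty version each top-level dependency brings in.
mvn dependency:tree -Dincludes=org.eclipse.jetty

# Adding -Dverbose also lists the versions Maven omitted due to conflicts.
mvn dependency:tree -Dverbose
```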

