OK, here is what I am trying.

I created a class called CrawlServlet in my server package.

import java.io.IOException;

import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;


public class CrawlServlet implements Filter {

    @Override
    public void init(FilterConfig config) throws ServletException {
        // no initialization needed
    }

    @Override
    public void doFilter(ServletRequest request, ServletResponse response,
            FilterChain chain) throws IOException, ServletException {
        HttpServletRequest httpRequest = (HttpServletRequest) request;
        String requestURI = httpRequest.getRequestURI();
        if ((requestURI != null) && (requestURI.contains("_escaped_fragment_"))) {
            System.out.println(requestURI);
        } else {
            try {
                // not an _escaped_fragment_ URL, so pass it up the chain of filters
                chain.doFilter(request, response);
            } catch (ServletException e) {
                System.err.println("Servlet exception caught: " + e);
                e.printStackTrace();
            }
        }
    }

    @Override
    public void destroy() {
        // no cleanup needed
    }
}


In lib/web.xml, I have

  <filter>
     <filter-name>CrawlServlet</filter-name>
     <filter-class>CrawlServlet</filter-class>
  </filter>


  <filter-mapping>
     <filter-name>CrawlServlet</filter-name>
     <url-pattern>/*</url-pattern>
  </filter-mapping>


After running it, I get this error:

Starting Jetty on port 8888
[WARN]
java.lang.ClassNotFoundException: CrawlServlet
at java.lang.ClassLoader.findClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
    at java.lang.ClassLoader.loadClass(Unknown Source)
.....
[WARN] FAILED CrawlServlet: javax.servlet.UnavailableException: CrawlServlet
javax.servlet.UnavailableException: CrawlServlet
....
[ERROR] 503 - GET /Myproject.html?gwt.codesvr=127.0.0.1:9997 (127.0.0.1) 1299 bytes
   Request headers
      Accept: text/html, application/xhtml+xml, */*
      Accept-Language: en-AU
      User-Agent: Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko
      Accept-Encoding: gzip, deflate
      Host: 127.0.0.1:8888
      If-Modified-Since: Wed, 16 Apr 2014 00:35:41 GMT
      Connection: keep-alive
   Response headers
      Cache-Control: must-revalidate,no-cache,no-store
      Content-Type: text/html;charset=ISO-8859-1
      Content-Length: 1299


What is wrong?

Can you fix this?

Best answer

The class declared in

<filter-class>CrawlServlet</filter-class>


must be fully qualified. If CrawlServlet is in the server package, you need to specify

<filter-class>server.CrawlServlet</filter-class>


or whatever package the class actually lives in.
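For example, assuming the class really is declared in a package named server (substitute your actual package name), the source file must start with a matching package declaration, and the web.xml entry must use the same fully qualified name:

```java
// first line of CrawlServlet.java — must match <filter-class> in web.xml
package server;
```

```xml
<filter>
   <filter-name>CrawlServlet</filter-name>
   <!-- fully qualified: package name + class name -->
   <filter-class>server.CrawlServlet</filter-class>
</filter>

<filter-mapping>
   <filter-name>CrawlServlet</filter-name>
   <url-pattern>/*</url-pattern>
</filter-mapping>
```

The ClassNotFoundException in the log is exactly what the servlet container throws when the name in <filter-class> does not resolve to a class on the webapp's classpath.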
