
How to connect FSCrawler REST with docker-compose

I've successfully indexed a PDF using FSCrawler, but I'm not able to connect to the FSCrawler REST service to build a pipeline into elasticsearch. This is my command in docker-compose:

command: fscrawler fscrawler_rest

I'm able to query elasticsearch with my FSCrawler job name as the index and retrieve the results. When I add the --rest flag to my docker-compose command, the REST service starts successfully (albeit with a warning I don't understand):

WARN  [o.g.j.i.i.Providers] A provider fr.pilato.elasticsearch.crawler.fs.rest.UploadApi registered in SERVER runtime does not implement any provider interfaces applicable in the SERVER runtime. 
      Due to constraint configuration problems the provider fr.pilato.elasticsearch.crawler.fs.rest.UploadApi will be ignored.
INFO  [f.p.e.c.f.r.RestServer] FS crawler Rest service started on [http://127.0.0.1:8080/fscrawler]

Then when I try curl, with or without the trailing slash (curl -XGET "127.0.0.1:8080/fscrawler/"), I get: curl: (7) Failed to connect to 127.0.0.1 port 8080: Connection refused
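One thing worth checking first: the startup log above shows the REST service bound to http://127.0.0.1:8080/fscrawler, which inside a container is the container's own loopback interface, and a connection refused from the host often just means the port was never published. A hedged check (service/container names taken from the compose file below):

```shell
# If the "Ports" column for the fscrawler service shows no mapping like
# 0.0.0.0:8080->8080/tcp, connections from the host are refused before
# they ever reach the container.
docker-compose ps

# Ask Docker which host port (if any) container port 8080 is published on.
docker port fscrawler 8080
```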

new docker-compose command for reference:

command: fscrawler fscrawler_rest --loop 0 --rest debug

I can't seem to debug it well, as docker-compose doesn't make it easy to poke at a running container, but I don't understand why I can still reach my job index in elasticsearch at http://localhost:9200/fscrawler_rest while the REST service refuses connections.
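For what it's worth, docker-compose can run one-off commands inside an already-running container via exec, which helps separate "the REST server is down" from "the port just isn't reachable from the host". A sketch, assuming curl is available inside the fscrawler image:

```shell
# From inside the container, 127.0.0.1:8080 is the container's own loopback,
# so this succeeds if the REST server is running at all, regardless of
# whether the port is published to the host.
docker-compose exec fscrawler curl -s "http://127.0.0.1:8080/fscrawler/"

# Follow the FSCrawler logs to see which address the REST server bound to.
docker-compose logs -f fscrawler
```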

FSCrawler is working with elasticsearch but the REST service doesn't seem to be working. Has anyone been successful using the FSCrawler REST API?

EDIT:

version: '3.6'

services:
  postgres:
    image: "postgres:12.1"
    env_file:
      - '.env'
    ports:
      - '127.0.0.1:5432:5432'
    restart: "${DOCKER_RESTART_POLICY:-unless-stopped}"
    stop_grace_period: "${DOCKER_STOP_GRACE_PERIOD:-3s}"
    volumes:
      - postgres:/var/lib/postgresql/data
    networks: 
      - esnet

  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:7.8.0
    # build: ./es
    container_name: elasticsearch
    env_file:
      - ".env"
    depends_on:
      - "postgres"
    volumes:
      - esdata:/usr/share/elasticsearch/data
    environment:
      - node.name=elasticsearch
      - bootstrap.memory_lock=true
      - "ES_JAVA_OPTS=-Xms512m -Xmx512m"
      - discovery.type=single-node
      - network.host=0.0.0.0
      - network.publish_host=0.0.0.0
      - http.cors.enabled=true
      - http.cors.allow-origin=*
      - http.host=0.0.0.0
      - transport.host=0.0.0.0
    ulimits:
      memlock:
        soft: -1
        hard: -1
    ports:
      - 9200:9200
      - 9300:9300
    networks:
      - esnet

  fscrawler:
    # I have taken this docker image and updated to 2.7 snapshot: toto1310/fscrawler
    build:
      context: ${PWD}
      dockerfile: Dockerfile-toto
    container_name: fscrawler
    depends_on:
      - elasticsearch
    restart: always
    volumes:
      - ${PWD}/config:/root/.fscrawler
      - ${PWD}/data:/tmp/es
    networks: 
      - esnet
    environment:
      - FS_URL=/tmp/es
      - ELASTICSEARCH_URL=http://elasticsearch:9200
      - ELASTICSEARCH_INDEX=fscrawler_rest
    command: fscrawler fscrawler_rest --loop 0 --rest debug

volumes:
  postgres:
  esdata:
    driver: local

networks:
  esnet:

Adding ports to the fscrawler service

ports:
  - 8080:8080

gives an empty response unless you also change the rest url in the job's _settings.yaml:

rest:
  url: "http://fscrawler:8080"

This makes the REST server bind to the service hostname fscrawler instead of the container's loopback, so it is reachable from other containers on the esnet network and, through the published port, from the host.
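Putting the two changes together: publish the port in docker-compose.yml and point rest.url at an address other containers (and, through the published port, the host) can reach. A sketch, assuming FSCrawler 2.7, whose job settings live in _settings.yaml under the mounted config directory (here config/fscrawler_rest/_settings.yaml on the host):

```yaml
# docker-compose.yml - add to the fscrawler service so the host can reach
# container port 8080:
#   ports:
#     - 8080:8080

# config/fscrawler_rest/_settings.yaml (mounted at /root/.fscrawler inside
# the container). The default rest.url of http://127.0.0.1:8080/fscrawler
# binds only to the container's loopback; binding to the service hostname
# makes it reachable over the esnet network and the published port.
rest:
  url: "http://fscrawler:8080"
```

After recreating the container (docker-compose up -d --force-recreate fscrawler), curl from the host should now get a response from the REST service instead of connection refused.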
