
Reduce the execution time of Sidekiq jobs

I am currently working on an app that involves syncing contacts to a Rails server. I am using a Redis server and Sidekiq to perform the contact syncing in the background. My database is MongoDB and I am using the mongoid gem as the ODM. The workflow is as follows:

  1. Contacts on the phone are passed to the Rails server through the app, and on the Rails server they are queued in Redis.
  2. A cron job then triggers Sidekiq, which connects to Redis and processes the job.
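
Roughly, step 2 looks like this on the Rails side (the cron task and the worker class name here are illustrative, not the exact app code; the worker itself is sketched under the job description below):

# Illustrative: a cron-driven task walks the pending dumps stored in step 1
# and pushes one Sidekiq job per dump into the Redis queue.
fresh = ContactDump::CONTACT_DUMP_CONS[:FRESH]
ContactDump.where(status: fresh).each do |dump|
  ContactDumpWorker.perform_async(dump.id.to_s)
end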

One Sidekiq job is as follows:

  1. It has an array of contacts (up to ~3000 entries).
  2. It has to process each of these contacts; by processing I mean making insert queries to the DB, roughly as in the sketch below.
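
A minimal sketch of that worker (the class name is illustrative; it just calls the model methods shown further down):

class ContactDumpWorker
  include Sidekiq::Worker
  sidekiq_options queue: :contact_address  # one of the queues from sidekiq.yml

  def perform(c_dump_id)
    c_dump = ContactDump.find(c_dump_id)
    user   = User.where(_id: c_dump.user_id).first
    user.save_contacts_with_name(c_dump.contacts)  # one insert/upsert per contact
  end
end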

Now the problem is that Sidekiq takes an insane amount of time to complete the job. On average it takes 50-70 seconds per job.

Following are the relevant files

sidekiq.yml

# Sample configuration file for Sidekiq.
# Options here can still be overridden by cmd line args.
#   sidekiq -C config.yml

:verbose: true
:concurrency:  5
:logfile: ./log/sidekiq.log
:pidfile: ./tmp/pids/sidekiq.pid
:queues:
  - [new_wall, 1]        #6
  - [contact_wall, 1]    #7
  - [email, 1]           #5
  - [gcm_chat, 1]        #5
  - [contact_address, 1] #7
  - [backlog_contact_address, 5]
  - [comment, 7]
  - [default, 5]

mongoid.yml

development:
  # Configure available database sessions. (required)
  sessions:
    # Defines the default session. (required)
    default:
      # Defines the name of the default database that Mongoid can connect to.
      # (required).
      database: "<%= ENV['DB_NAME'] %>"
      # Provides the hosts the default session can connect to. Must be an array
      # of host:port pairs. (required)
      hosts:
        - "<%= ENV['MONGOD_URL'] %>"
      #username: "<%= ENV['DB_USERNAME'] %>"
      #password: "<%= ENV['DB_PASSWORD'] %>"
      options:
        #pool: 12

        # Change the default write concern. (default = { w: 1 })
        # write:
        #   w: 1

        # Change the default consistency model to primary, secondary.
        # 'secondary' will send reads to secondaries, 'primary' sends everything
        # to master. (default: primary)
        # read: secondary_preferred

        # How many times Moped should attempt to retry an operation after
        # failure. (default: The number of nodes in the cluster)
        # max_retries: 20

        # The time in seconds that Moped should wait before retrying an
        # operation on failure. (default: 0.25)
        # retry_interval: 0.25
  # Configure Mongoid specific options. (optional)
  options:
    # Includes the root model name in json serialization. (default: false)
    # include_root_in_json: false

    # Include the _type field in serializaion. (default: false)
    # include_type_for_serialization: false

    # Preload all models in development, needed when models use
    # inheritance. (default: false)
    # preload_models: false

    # Protect id and type from mass assignment. (default: true)
    # protect_sensitive_fields: true

    # Raise an error when performing a #find and the document is not found.
    # (default: true)
    # raise_not_found_error: true

    # Raise an error when defining a scope with the same name as an
    # existing method. (default: false)
    # scope_overwrite_exception: false

    # Use Active Support's time zone in conversions. (default: true)
    # use_activesupport_time_zone: true

    # Ensure all times are UTC in the app side. (default: false)
    # use_utc: false
test:
  sessions:
    default:
      database: db_test
      hosts:
        - localhost:27017
      options:
        read: primary
        # In the test environment we lower the retries and retry interval to
        # low amounts for fast failures.
        max_retries: 1
        retry_interval: 0


production:
  # Configure available database sessions. (required)
  sessions:
    # Defines the default session. (required)
    default:
      # Defines the name of the default database that Mongoid can connect to.
      # (required).
      database: "<%= ENV['DB_NAME']%>"
      # Provides the hosts the default session can connect to. Must be an array
      # of host:port pairs. (required)
      hosts:
        - "<%=ENV['MONGOD_URL']%>"
      username: "<%= ENV['DB_USERNAME']%>"
      password: "<%= ENV['DB_PASSWORD']%>"
      pool: 10
      options:

  # Configure Mongoid specific options. (optional)
  options:

Model.rb

def retry_save_contact_dump(c_dump_id)
  c_dump = ContactDump.where(_id: c_dump_id, status: ContactDump::CONTACT_DUMP_CONS[:ERROR]).first
  return false if c_dump.blank?
  user = User.where(_id: c_dump.user_id).first
  puts "retry_save_contact_dump"
  user.save_contacts_with_name(c_dump.contacts)
  c_dump.status = ContactDump::CONTACT_DUMP_CONS[:PROCESSED]
  c_dump.error_msg = ""
  c_dump.save
rescue => e
  c_dump.status = ContactDump::CONTACT_DUMP_CONS[:CANTSYNC]
  c_dump.error_msg = e.message
  c_dump.save
end


def save_contacts_with_name(c_array)
  m_num = Person.get_number_digest(self.mobile_number.to_s)
  c_array.each do |n|
    next if m_num == n["hash_mobile_number"]
    p = Person.where(h_m_num: n["hash_mobile_number"]).first_or_create
    save_friend(p) #if p.persisted?
    p.c_names.create(name: n["name"], user_id: self.id)
  end
end

ContactDump.rb

class ContactDump
  include Mongoid::Document
  include Mongoid::Timestamps::Created
  include Mongoid::Timestamps::Updated

  field :contacts,   type: Array
  field :status,     type: Integer, default: 0
  field :user_id,    type: BSON::ObjectId
  field :error_msg,  type: String

  CONTACT_DUMP_CONS = {FRESH: 0,  PROCESSED: 1, ERROR: 2, CANTSYNC: 3}
end

How can I speed up the processing of these jobs? I have tried various permutations of increasing the Sidekiq concurrency in sidekiq.yml and the connection pool in mongoid.yml, but that did not help.

How do WhatsApp and other messaging apps handle contact syncing?

If some other info is required, please ask. Thanks.

EDIT: If it is not possible to answer this question, can anyone please suggest other ways to sync the contacts on the Rails server?

Indexes to the rescue.

class ContactDump
  index({ status: 1 })   # speeds up queries that filter dumps by status
end

class Person
  index({ h_m_num: 1 })  # speeds up the Person lookup in save_contacts_with_name
end

Person might need more indexes depending on what your Person.get_number_digest does.

After adding the indexes, run rake db:mongoid:create_indexes

Also, remove the puts call: you don't need it in your worker, and puts hurts performance badly even when you can't see the output!
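
If you still want a trace of the job, a minimal alternative to that puts (using Sidekiq's built-in logger, which is a standard Ruby Logger) would be:

# Replaces the puts inside retry_save_contact_dump; the block form only
# builds the string when debug logging is actually enabled.
Sidekiq.logger.debug { "retry_save_contact_dump #{c_dump_id}" }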
