7 Replies Latest reply on Sep 14, 2015 1:07 AM by Mike Bennett

    Using resque - reposted from RhoMobile discussions

    Mike Bennett

I have an application that is deployed using RhoConnect.  It uses a Source Adapter to synchronize photos taken in the app.  Everything works well unless a user takes several photos while offline; then the sync process fails.  I've been advised by a Zebra expert that using Resque will solve this issue.

       

I'm using RhoConnect 5.1.1, but the application has been ported from version 4.1.0.

       

I've tried to follow the documentation, but I've run into the following issue:

       

       

      I edit settings.yml and add

       

      :sources:

      Image:

      :queue: image_queue

       

      I then start a worker using

       

      QUEUE=* rake resque:work

       

      When a device is synchronized, I successfully get a job for the worker to process, but this job fails with the following details

       

      Worker B-P-MOBILE-APP2:10835

      Class Rhoconnect::SourceJob

      Arguments

      --- query

      ... 

      --- Image

      ...

      --- application

      ...

      --- 40001-05-8765-AB3

      ...

      ---

      ...

      Exception ArgumentError

      Error Unknown source

      /opt/rhoconnect/lib/ruby/gems/2.2.0/gems/rhoconnect-5.1.1/lib/rhoconnect/handler/query/engine.rb:12:in `initialize' /opt/rhoconnect/lib/ruby/gems/2.2.0/gems/rhoconnect-5.1.1/lib/rhoconnect/jobs/source_job.rb:15:in `new' /opt/rhoconnect/lib/ruby/gems/2.2.0/gems/rhoconnect-5.1.1/lib/rhoconnect/jobs/source_job.rb:15:in `perform'

       

      The code for the Image model is:

       

require 'active_record'

require 'fileutils'

class Image < Rhoconnect::Model::Base

        #include Rhoconnectrb::Resource

       

        @@photosFolder = "/opt/nginx/html/photos"

       

        def initialize(source)

          super(source)

        end

       

        def login

          # TODO: Login to your data source here if necessary

        end

       

        def query(params=nil)

       

          records = DbImage.where(["deviceid = ?", current_user.login])

          @result = Hash.new

          hash_index = 1

    unless records.nil?

              records.each do |record|

       

                  @result[hash_index.to_s] = record.as_json

       

                  @result[hash_index.to_s].delete("image_uri")

       

                  hash_index = hash_index + 1

       

                end

          end

       

          puts @result

       

          return @result

        end

       

        def create(create_hash)

          # Create a new record in your backend data source

          puts "create_hash: #{create_hash}"

       

          name = create_hash["image_uri"]

       

          # filename we saved in application.rb#store_blob method

          basename = create_hash["filename"]

       

          DbImage.create(create_hash)

       

          return create_hash["filename"]

        end

       

        def update(update_hash)

          # TODO: Update an existing record in your backend data source

        end

       

  def delete(delete_hash)

    records = DbImage.where(["uniqueid = ?", delete_hash['uniqueid']])

    # We matched on uniqueid rather than the primary key, so delete the
    # matched row via the relation instead of Relation#delete(id)
    records.delete_all if records.length == 1

  end

       

        def logoff

          # TODO: Logout from the data source if necessary

        end

       

        def store_blob(object,field_name,blob)

          root_path = "/"

          file_name = blob[:filename]

       

          # Save the file to the path

          from = blob[:tempfile].path

          to = "#{@@photosFolder}/#{blob[:filename].to_s}"

       

    # Create the destination folder if it doesn't exist
    # (check the parent directory, not the file path itself)
    FileUtils.mkdir_p(File.dirname(to)) unless File.directory?(File.dirname(to))

       

          #Copy file

          FileUtils.cp(from, to)

       

          #Change file permissions so that it is readable by everyone

          File.chmod(0666, to)

       

          #Change group and owner to planetrails so that it can delete the file after processing

          ownerId = `id -u planetrails`

          groupId = `id -g planetrails`

          File.chown(ownerId.to_i, groupId.to_i, to)

       

          object['filename'] = blob[:filename].to_s

        end

      end

       

      What am I missing?

        • Re: Using resque - reposted from RhoMobile discussions
          Krishna Raja

          Mike,

           

I tried the same thing (except for DbImage) on my Windows box and didn't face any problems, so it must be related to your environment. We need to put some logging in SourceJob to find more details.

           

When you configure a queue, it should trigger async processing for both create and query. Is the create going through fine?

           

          #Sources

          :sources:

            BlobImage:

              :poll_interval: 30

              :queue: image_queue

           

           

          :development:

            :licensefile: settings/license.key

            :redis: 127.0.0.1:6379

            :syncserver: http://localhost:9292

            :api_token: rhoconnect_api_token
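One thing worth ruling out: YAML is indentation-sensitive, and the snippet in your post shows Image: at the same level as :sources: rather than nested under it. If your settings.yml really is flat like that, no source ever gets registered for the queue, which could well be why SourceJob raises "Unknown source". A quick sketch of the difference in plain Ruby (a minimal illustration, assuming the same symbol-key loading behaviour RhoConnect relies on for settings.yml):

```ruby
require 'yaml'

# Psych 4 (Ruby 3.1+) made YAML.load safe by default; fall back to
# unsafe_load there so :symbol keys parse the same way on all versions.
def load_settings(text)
  YAML.respond_to?(:unsafe_load) ? YAML.unsafe_load(text) : YAML.load(text)
end

# "Image:" at the top level (not indented under :sources:) leaves
# :sources with no value, so no source is registered for the queue.
flat = load_settings(<<~YML)
  :sources:
  Image:
    :queue: image_queue
YML

# Indenting Image under :sources: nests it where it belongs.
nested = load_settings(<<~YML)
  :sources:
    Image:
      :queue: image_queue
YML

flat[:sources]    # => nil
nested[:sources]  # => {"Image"=>{:queue=>"image_queue"}}
```

If flat[:sources] comes back empty on your box, indenting Image: two spaces under :sources: (as in the config I posted above) should fix the lookup.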

           

           

The image model is as below:

           

          require 'active_record'

          require 'fileutils'

           

           

          class BlobImage < Rhoconnect::Model::Base

            def initialize(source)

              super(source)

            end

           

            def login

              # TODO: Login to your data source here if necessary

            end

           

            def query(params=nil)

              # TODO: Query your backend data source and assign the records

              # to a nested hash structure called @result. For example:

              # @result = {

              #   "1"=>{"name"=>"Acme", "industry"=>"Electronics"},

              #   "2"=>{"name"=>"Best", "industry"=>"Software"}

              # }

          #   raise Rhoconnect::Model::Exception.new("Please provide some code to read records from the backend data source")

    puts "inside query"

    return {}

            end

           

           

            def create(create_hash)

              # TODO: Create a new record in your backend data source

          #   raise "Please provide some code to create a single record in the backend data source using the create_hash"

    puts "create_hash: #{create_hash}"

           

           

              name = create_hash["image_uri"]

             

              # filename we saved in application.rb#store_blob method

              basename = create_hash["filename"]

           

           

           

           

    puts "basename is #{basename}"

           

              #DbImage.create(create_hash)

           

              return create_hash["filename"]

            end

           

            def update(update_hash)

              # TODO: Update an existing record in your backend data source

              raise "Please provide some code to update a single record in the backend data source using the update_hash"

            end

           

            def delete(delete_hash)

              # TODO: write some code here if applicable

              # be sure to have a hash key and value for "object"

              # for now, we'll say that its OK to not have a delete operation

              # raise "Please provide some code to delete a single object in the backend application using the object_id"

            end

           

            def logoff

              # TODO: Logout from the data source if necessary

            end

           

           

            def store_blob(object,field_name,blob)

              # TODO: Handle post requests for blobs here.

              # make sure you store the blob object somewhere permanently

              # raise "Please provide some code to handle blobs if you are using them."

    puts "fieldname: #{field_name}"

              puts "store_blob: #{blob}"

              root_path = "/"

              file_name = blob[:filename]

              puts "filename: #{blob[:filename]}"

           

           

              # Save the file to the path

              puts "class: #{blob[:tempfile].class}"

              from = blob[:tempfile].path

              to = "#{blob[:filename].to_s}"

           

           

              puts "Copying #{from} => #{to}"

           

           

              # Create folder path if it doesn't exist

              #if (!File.directory?(to))

              # FileUtils.mkdir_p(File.dirname(to))

              #end

             

              #Copy file

              FileUtils.cp(from, to)

             

              #Change file permissions so that it is readable by everyone

              File.chmod(0666, to)

             

              #Change group and owner to PlanetAdmin

              File.chown(500, 500, to)

             

             

              object['filename'] = blob[:filename].to_s

           

           

            end

          end

           

           

          Thanks

          Krishna

            • Re: Using resque - reposted from RhoMobile discussions
              Tahir Zamir

              Hi,

               

               

Just for clarity: I am working with Mike on this issue, and this is an update on where we are.

               

              We have Resque working now. 

               

We have just one queue and just one model being synced by it, with five workers.

               

The immediate impact was that CPU usage on our dual-core Linux box shot up to 100%.  Prior to this, CPU usage simply fluctuated, with regular peaks and troughs.

               

The number of pending jobs in the Resque queue has risen steadily and is now around 3,000.  The stats page states that over 500,000 jobs have been processed in the last couple of hours.

               

The 500,000 processed jobs just don't correlate with what we understand is happening in terms of the volume of sync requests for the one model in question.
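To put rough numbers on that mismatch (using the figures from the stats page above, with "a couple of hours" taken as two):

```ruby
# Approximate throughput implied by the Resque stats quoted above
processed_jobs = 500_000
window_seconds = 2 * 60 * 60            # "the last couple of hours"

jobs_per_second = processed_jobs / window_seconds.to_f
puts jobs_per_second.round(2)           # => 69.44
```

A few device syncs per minute is well under one job per second, so something on the server appears to be generating jobs far faster than device syncs alone would explain.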

               

When the pending-jobs queue (now listing 3,000+ jobs) is examined, the jobs do appear to be coming from valid devices out in the field.  Why is this number steadily increasing?  Five workers should easily handle a small number of sync requests every minute.

               

              Any insight from the community would be appreciated.

               

              Tahir