I have an application deployed with RhoConnect. It uses a Source Adapter to synchronize photos taken in the app. Everything works well unless a user takes several photos while offline, in which case the sync process fails. A Zebra expert advised me that using Resque would solve this issue.
I'm using RhoConnect 5.1.1, but the application has been ported from version 4.1.0.
I've tried to follow the documentation, but I have the following issue:
I edit settings.yml and add
:sources:
  Image:
    :queue: image_queue
I then start a worker using
QUEUE=* rake resque:work
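(For context, the worker is started from the application directory so it picks up Resque's rake tasks and boots the app. If that wiring ever has to be done by hand, a minimal Rakefile sketch is below; treating application.rb as the entry point is an assumption, and the RhoConnect-generated Rakefile may already do the equivalent.)

# Rakefile (sketch) -- only needed if your Rakefile does not already load these.
# `rake resque:work` comes from Resque's own tasks, and the worker must boot the
# RhoConnect app so the Image adapter is registered before jobs are processed.
require 'resque/tasks'

# Resque runs the resque:setup hook before the worker starts polling queues.
task 'resque:setup' do
  require_relative 'application'   # assumed app entry point
end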
When a device is synchronized, I successfully get a job for the worker to process, but the job fails with the following details:
Worker B-P-MOBILE-APP2:10835
Class Rhoconnect::SourceJob
Arguments
- "query"
- "Image"
- "application"
- "40001-05-8765-AB3"
- nil
Exception ArgumentError
Error Unknown source
/opt/rhoconnect/lib/ruby/gems/2.2.0/gems/rhoconnect-5.1.1/lib/rhoconnect/handler/query/engine.rb:12:in `initialize'
/opt/rhoconnect/lib/ruby/gems/2.2.0/gems/rhoconnect-5.1.1/lib/rhoconnect/jobs/source_job.rb:15:in `new'
/opt/rhoconnect/lib/ruby/gems/2.2.0/gems/rhoconnect-5.1.1/lib/rhoconnect/jobs/source_job.rb:15:in `perform'
The code for the Image model is:
require 'fileutils'

class Image < Rhoconnect::Model::Base
  # include Rhoconnectrb::Resource
  @@photosFolder = "/opt/nginx/html/photos"

  def initialize(source)
    super(source)
  end

  def login
    # TODO: Login to your data source here if necessary
  end

  def query(params=nil)
    records = DbImage.where(["deviceid = ?", current_user.login])
    @result = Hash.new
    hash_index = 1
    unless records.nil?
      records.each do |record|
        @result[hash_index.to_s] = record.as_json
        @result[hash_index.to_s].delete("image_uri")
        hash_index += 1
      end
    end
    puts @result
    return @result
  end

  def create(create_hash)
    # Create a new record in your backend data source
    puts "create_hash: #{create_hash}"
    name = create_hash["image_uri"]
    # filename we saved in application.rb#store_blob method
    basename = create_hash["filename"]
    DbImage.create(create_hash)
    return create_hash["filename"]
  end

  def update(update_hash)
    # TODO: Update an existing record in your backend data source
  end

  def delete(delete_hash)
    record = DbImage.where("uniqueid = ?", delete_hash['uniqueid'])
    if record.length == 1
      record.delete(delete_hash['uniqueid'])
    end
  end

  def logoff
    # TODO: Logout from the data source if necessary
  end

  def store_blob(object, field_name, blob)
    root_path = "/"
    file_name = blob[:filename]
    # Save the file to the path
    from = blob[:tempfile].path
    to = "#{@@photosFolder}/#{blob[:filename].to_s}"
    # Create the folder path if it doesn't exist
    FileUtils.mkdir_p(File.dirname(to)) unless File.directory?(File.dirname(to))
    # Copy the file
    FileUtils.cp(from, to)
    # Change file permissions so that it is readable by everyone
    File.chmod(0666, to)
    # Change group and owner to planetrails so that it can delete the file after processing
    ownerId = `id -u planetrails`
    groupId = `id -g planetrails`
    File.chown(ownerId.to_i, groupId.to_i, to)
    object['filename'] = blob[:filename].to_s
  end
end
What am I missing?
1 Reply
Mike,
I tried the same thing (except for DbImage) on my Windows box and didn't face any problem, so it must be related to your environment. We need to put some logs in SourceJob to find more details.
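One way to get those logs without editing the gem is a Resque on_failure hook on SourceJob; a minimal sketch (the module and file names are just placeholders, and loading it from application.rb is an assumption):

# log_source_job.rb (hypothetical file, loaded from your application.rb) --
# attaches a Resque failure hook to SourceJob so a failing job prints its
# exception and arguments to the worker output.
module SourceJobLogging
  # Resque calls any class method matching on_failure_* with the exception
  # and the original job arguments.
  def on_failure_log_arguments(exception, *args)
    puts "SourceJob failed: #{exception.class}: #{exception.message}"
    puts "Arguments: #{args.inspect}"
  end
end

Rhoconnect::SourceJob.extend(SourceJobLogging)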
When you create a queue, it should trigger async processing for both create and query. Is the create going through fine?
# Sources
:sources:
  BlobImage:
    :poll_interval: 30
    :queue: image_queue

:development:
  :licensefile: settings/license.key
  :redis: 127.0.0.1:6379
  :syncserver: http://localhost:9292
  :api_token: rhoconnect_api_token
The Image model is as below:
require 'active_record'
require 'fileutils'

class BlobImage < Rhoconnect::Model::Base
  def initialize(source)
    super(source)
  end

  def login
    # TODO: Login to your data source here if necessary
  end

  def query(params=nil)
    # TODO: Query your backend data source and assign the records
    # to a nested hash structure called @result. For example:
    # @result = {
    #   "1"=>{"name"=>"Acme", "industry"=>"Electronics"},
    #   "2"=>{"name"=>"Best", "industry"=>"Software"}
    # }
    # raise Rhoconnect::Model::Exception.new("Please provide some code to read records from the backend data source")
    puts "inside query"
    return {}
  end

  def create(create_hash)
    # TODO: Create a new record in your backend data source
    # raise "Please provide some code to create a single record in the backend data source using the create_hash"
    puts "create_hash: #{create_hash}"
    name = create_hash["image_uri"]
    # filename we saved in application.rb#store_blob method
    basename = create_hash["filename"]
    puts "basename is #{basename}"
    # DbImage.create(create_hash)
    return create_hash["filename"]
  end

  def update(update_hash)
    # TODO: Update an existing record in your backend data source
    raise "Please provide some code to update a single record in the backend data source using the update_hash"
  end

  def delete(delete_hash)
    # TODO: write some code here if applicable
    # be sure to have a hash key and value for "object"
    # for now, we'll say that it's OK to not have a delete operation
    # raise "Please provide some code to delete a single object in the backend application using the object_id"
  end

  def logoff
    # TODO: Logout from the data source if necessary
  end

  def store_blob(object, field_name, blob)
    # TODO: Handle post requests for blobs here.
    # make sure you store the blob object somewhere permanently
    # raise "Please provide some code to handle blobs if you are using them."
    puts "fieldname: #{field_name}"
    puts "store_blob: #{blob}"
    root_path = "/"
    file_name = blob[:filename]
    puts "filename: #{blob[:filename]}"
    # Save the file to the path
    puts "class: #{blob[:tempfile].class}"
    from = blob[:tempfile].path
    to = "#{blob[:filename].to_s}"
    puts "Copying #{from} => #{to}"
    # Create folder path if it doesn't exist
    # if (!File.directory?(to))
    #   FileUtils.mkdir_p(File.dirname(to))
    # end
    # Copy file
    FileUtils.cp(from, to)
    # Change file permissions so that it is readable by everyone
    File.chmod(0666, to)
    # Change group and owner to PlanetAdmin
    File.chown(500, 500, to)
    object['filename'] = blob[:filename].to_s
  end
end
Thanks
Krishna