Provide a response object to remote worker scripts.
Glues together IronWorker and AWS S3 to provide a response object to remote worker scripts. This allows you to write massively concurrent Ruby programs without worrying about threads.
require "iron_response"
config = {...}
batch = IronResponse::Batch.new
batch.auto_update_worker = true
batch.config[:iron_io] = config[:iron_io]
batch.config[:aws_s3] = config[:aws_s3]
batch.worker = "test/workers/is_prime.rb"
batch.params_array = Array(1..10).map {|i| {number: i}}
results = batch.run!
p results
#=> [{"result"=>false}, {"result"=>true}, {"result"=>true}...]
This assumes you have a worker file called `is_prime.rb`:
require "iron_response"
IronResponse::Responder.new(binding) do
def is_prime?(n)
("1" * n =~ /^1?$|^(11+?)\1+$/) == 0 ? false : true
end
{
result: is_prime?(params[:number])
}
end
Iron.io's IronWorker is a great product that provides a lot of powerful concurrency options. With IronWorker, you can scale tasks to hundreds and even thousands of workers. However, IronWorker was missing one useful feature for me: responses. What do I mean? In the typical IronWorker setup, worker files are just one-off scripts that run independently of the client that queues them up. For example:
```ruby
client = IronWorkerNG::Client.new

100.times do |i|
  client.tasks.create("do_something", number: i)
end
```
For many use cases, this is fine. But what if I want to know the result of `do_something`? A simple way to get the result would be for your worker to POST the final result somewhere, then have the client retrieve it. This gem abstracts that process away, allowing the developer to avoid boilerplate and to keep worker code elegant.
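To make that concrete, the hand-rolled version looks roughly like this (a sketch only: the endpoint URL, the `task_id` parameter, and the result shape are all hypothetical):

```ruby
# --- worker side (IronWorker provides `params` inside the script) ---
require "net/http"
require "json"

result = { number: params[:number], doubled: params[:number] * 2 }

# POST the final result to some endpoint the client can reach.
Net::HTTP.post_form(URI("http://example.com/results/#{params[:task_id]}"),
                    payload: JSON.generate(result))

# --- client side: fetch and decode the result for a given task ---
raw = Net::HTTP.get(URI("http://example.com/results/42"))
result = JSON.parse(raw)
```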
On top of all this, another benefit of using this gem is that it makes workers much easier to test, since a worker's result is something you can assert against.
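For example, a worker test could look something like this (a sketch under assumptions: the file paths and `Configuration` class mirror the examples below, and running it performs real IronWorker and S3 calls):

```ruby
require "iron_response"
require "minitest/autorun"
require_relative "configuration"

class IsPrimeWorkerTest < Minitest::Test
  def test_seven_is_prime
    batch = IronResponse::Batch.new
    batch.auto_update_worker = true
    batch.config = Configuration.keys
    batch.worker = "test/workers/is_prime.rb"
    batch.params_array = [{ number: 7 }]

    # The batch returns the workers' final expressions, decoded from JSON.
    assert_equal [{ "result" => true }], batch.run!
  end
end
```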
Under the hood, `iron_response` uses functional programming and metaprogramming to capture the final expression of a worker file, convert it to JSON, and then POST it to Amazon S3. When all the workers in an `IronResponse::Batch` have finished, the gem retrieves the files and converts the JSON strings back to Ruby.
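Conceptually, the capture step works something like the toy version below; this is a simplified illustration, not the gem's actual implementation:

```ruby
require "json"

# A toy responder: evaluate a block and serialize its final expression.
# In Ruby, a block's last expression is its return value, which is what
# makes this pattern possible.
class ToyResponder
  def initialize(&block)
    @payload = JSON.generate(block.call)
  end

  attr_reader :payload
end

ToyResponder.new { { result: true } }.payload
#=> "{\"result\":true}"
```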
This process has a few important implications. Most notably, a worker's final expression must survive a round trip through JSON, so it should be built from values of type `String`, `Fixnum`, `Hash`, and `Array`, as the sketch below illustrates.
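In the sketch below, `SomeCustomClass` is a hypothetical stand-in for any object without a natural JSON representation:

```ruby
# Fine: the final expression is a Hash of JSON-friendly values.
IronResponse::Responder.new(binding) do
  { result: params[:number] * 2 }
end

# Problematic: a custom object cannot be serialized to JSON and
# reconstructed on the client side.
IronResponse::Responder.new(binding) do
  SomeCustomClass.new(params[:number])
end
```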
This gem requires a basic understanding of how to use IronWorker with Ruby.
Assuming you have an empty directory called `foo`:
```
$ mkdir workers
$ cd workers
$ touch my_worker.rb
```
`my_worker.rb` should look like this:
require "iron_response"
IronResponse::Responder.new(binding) do
# your code here
end
To run this worker, create two files at the top level of `foo`: `configuration.rb` and `enqueue.rb`.

`configuration.rb`:
```ruby
class Configuration
  def self.keys
    {
      iron_io: {
        token: "123",
        project_id: "123"
      },
      aws_s3: {
        access_key_id: "123",
        secret_access_key: "123",
        bucket: "iron_response"
      }
    }
  end
end
```
Obviously, fill in the appropriate API keys. It is highly recommended that you do not use your AWS master keys. Instead, go to the AWS Console, click on "IAM", and create a user with a policy that allows it to edit the bucket named in the configuration file. Here's an example policy:
```json
{
  "Statement": [
    {
      "Action": "s3:*",
      "Effect": "Allow",
      "Resource": [
        "arn:aws:s3:::iron_response",
        "arn:aws:s3:::iron_response/*"
      ]
    }
  ]
}
```
Now, write your queueing script.

`enqueue.rb`:
require_relative "configuration"
require "iron_response"
config = Configuration.keys
batch = IronResponse::Batch.new
batch.auto_update_worker = true
batch.config = config
batch.worker = "workers/my_worker.rb"
# The `params_array` is an Array of Hashes
# that get sent as the payload to IronWorker scripts.
batch.params_array = Array ("a".."z").map {|i| {letter: i}}
results = batch.run!
If your worker code requires any gems, you can use `iron_worker_ng`'s API:
batch.code.merge_gem("nokogiri", "< 1.6.0") # decreases remote build time
batch.code.merge_gem("ecfs")
batch.code.full_remote_build(true)
Add this line to your application's Gemfile:

```ruby
gem "iron_response"
```

And then execute:

```
$ bundle
```

Or install it yourself as:

```
$ gem install iron_response
```
To contribute:

1. Fork it
2. Create your feature branch (`git checkout -b my-new-feature`)
3. Commit your changes (`git commit -am 'Added some feature'`)
4. Push to the branch (`git push origin my-new-feature`)
5. Create a new Pull Request