class FogSite::Deployer
Executes a deploy. This object is not safe for reuse: the `@index` and `@updated_paths` instance variables stay dirty after a deploy to allow debugging and inspection by client scripts.
Public Class Methods
  # File lib/fog_site.rb, line 50
  def initialize( site )
    @site = site
    @index = {}
    @updated_paths = []
  end
Run a single deploy. Creates a new `Deployer` and calls `run`.
  # File lib/fog_site.rb, line 45
  def self.run( site, options = {} )
    deployer = Deployer.new( site )
    deployer.run
  end
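For illustration, a client script might drive a deploy as shown below. The `FogSite::Site` constructor and attribute writers are assumptions based on the attributes this class reads, not a confirmed API:

  # Hypothetical sketch; only Deployer.run is documented here.
  site = FogSite::Site.new                        # assumed constructor
  site.access_key_id   = ENV["AWS_ACCESS_KEY_ID"]
  site.secret_key      = ENV["AWS_SECRET_ACCESS_KEY"]
  site.domain_name     = "www.example.com"        # bucket name
  site.path            = "./_site"                # local site root
  site.distribution_id = nil                      # CloudFront ID, if any

  FogSite::Deployer.run( site )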
Public Instance Methods
  # File lib/fog_site.rb, line 173
  def assert_not_nil( value, error )
    raise UsageError.new( error ) unless value
  end
Build an index of all the local files and their MD5 sums. This will be used to decide what needs to be deployed.
  # File lib/fog_site.rb, line 92
  def build_index
    Dir["**/*"].each do |path|
      unless File.directory?( path )
        @index[path] = Digest::MD5.file(path).to_s
      end
    end
  end
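After this runs, `@index` maps each relative path to a 32-character hex digest. `Digest::MD5.file` comes from Ruby's standard library, and the digest matches the ETag S3 reports for single-part uploads, which is what `sync_remote` relies on. The paths and digests below are illustrative:

  require 'digest/md5'

  Digest::MD5.file( "index.html" ).to_s
  # => "0cc175b9c0f1b6a831c399e269772661"   (example value)

  # Resulting shape of @index:
  # { "index.html"   => "0cc175b9c0f1b6a831c399e269772661",
  #   "css/main.css" => "92eb5ffee6ae2fec3ad71c777531578f" }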
  # File lib/fog_site.rb, line 153
  def cdn
    @cdn ||= Fog::CDN.new( cdn_credentials )
  end
  # File lib/fog_site.rb, line 169
  def cdn_credentials
    credentials.delete_if { |k, v| k == :region }
  end
  # File lib/fog_site.rb, line 157
  def connection
    @connection ||= Fog::Storage.new( credentials )
  end
  # File lib/fog_site.rb, line 161
  def credentials
    { :provider              => 'AWS',
      :aws_access_key_id     => @site.access_key_id,
      :aws_secret_access_key => @site.secret_key
    }.merge @site.fog_options
  end
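For example, if `fog_options` pins the bucket's region, the two credential hashes differ only in that key; `cdn_credentials` strips `:region` because CloudFront is not region-scoped. Key values below are placeholders:

  # Assuming @site.fog_options == { :region => "eu-west-1" }:
  credentials
  # => { :provider              => 'AWS',
  #      :aws_access_key_id     => "AKIA...",
  #      :aws_secret_access_key => "...",
  #      :region                => "eu-west-1" }

  cdn_credentials
  # => the same hash without :region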
Compose and post a cache invalidation request to CloudFront. This ensures that the distribution's edge caches serve the latest content quickly.
  # File lib/fog_site.rb, line 147
  def invalidate_cache( distribution_id )
    unless @updated_paths.empty?
      cdn.post_invalidation distribution_id, @updated_paths
    end
  end
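A sketch of the call this produces after a deploy that rewrote the front page; the distribution ID is a placeholder:

  # @updated_paths == ["/index.html", "/"]
  cdn.post_invalidation( "E2EXAMPLE123", ["/index.html", "/"] )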
Creates an S3 bucket configured for web site serving, using `index.html` as the index document and `404.html` as the error page.
  # File lib/fog_site.rb, line 78
  def make_directory
    domain = @site.domain_name
    puts "Using bucket: #{domain}".blue
    @directory = connection.directories.get domain
    unless @directory
      puts "Creating new bucket.".red
      @directory = connection.directories.create :key => domain,
                                                 :public => true
    end
    connection.put_bucket_website(domain, 'index.html', :key => "404.html")
  end
  # File lib/fog_site.rb, line 131
  def mark_updated( path )
    @updated_paths << path
    if path =~ /\/index\.html$/
      @updated_paths << path.sub( /index\.html$/, '' )
    end
  end
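CloudFront caches `/about/index.html` and `/about/` as distinct objects, so both variants are queued when an index page changes:

  mark_updated( "/about/index.html" )
  # @updated_paths now includes "/about/index.html" and "/about/"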
Validate our `Site`, create and configure a bucket, build the index, sync the files and (finally) invalidate any paths which have been updated on the content distribution network.
  # File lib/fog_site.rb, line 59
  def run
    validate
    make_directory
    Dir.chdir @site.path do
      build_index
      sync_remote
      if( @site.distribution_id )
        invalidate_cache(@site.distribution_id)
      end
    end
  end
Synchronize our local copy of the site with the remote one. This uses the index to detect what has changed and uploads only new or updated files. Comparing a local MD5 sum to the remote ETag works because single-part uploads, like the ones `write_file` performs, have an ETag equal to the object's MD5. Helpful debugging information is emitted, and we're left with a populated `@updated_paths` instance variable which can be used to invalidate cached content.
  # File lib/fog_site.rb, line 105
  def sync_remote
    @directory.files.each do |remote_file|
      path = remote_file.key
      local_file_md5 = @index[path]

      if local_file_md5.nil? and @site.destroy_old_files
        puts "#{path}: deleted".red
        remote_file.destroy
        mark_updated( "/#{path}" )
      elsif local_file_md5 == remote_file.etag
        puts "#{path}: unchanged".white
        @index.delete( path )
      else
        puts "#{path}: updated".green
        write_file( path )
        @index.delete( path )
        mark_updated( "/#{path}" )
      end
    end

    @index.each do |path, md5|
      puts "#{path}: new".green
      write_file( path )
    end
  end
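The emitted output looks roughly like this (colorized when the terminal supports it); the file names are illustrative:

  index.html: updated
  css/main.css: unchanged
  old-page.html: deleted
  posts/new-post.html: new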
  # File lib/fog_site.rb, line 71
  def validate
    assert_not_nil @site.access_key_id, "No AccessKeyId specified"
    assert_not_nil @site.secret_key, "No SecretKey specified"
  end
Push a single file out to S3.
  # File lib/fog_site.rb, line 139
  def write_file( path )
    @directory.files.create :key => path,
                            :body => File.open( path ),
                            :public => true
  end