AWS CloudFront + S3 + Allow all CORS

If you’ve ever set up CloudFront with CORS you know that it can be a little tricky to get right. In fact, the configuration AWS recommends makes it incredibly difficult to take full advantage of the CloudFront caching while still honoring CORS.

If you have some specific set of allowed origins with different CORS rules, then forwarding the Origin header makes sense, but what about the case where you simply want to allow CORS for all origins? Forwarding the Origin header reduces the ability to cache the response. If you want to support Simple Requests to skip the OPTIONS preflight, AWS may not return the proper CORS headers at all. I’ve even seen cases where, if a GET request was made to a URL (like an image) without CORS, CloudFront would cache the non-CORS version of that resource, even when later requests added the Origin header. Of course this probably means I didn’t set up my CloudFront caching “exactly just right”, but why make things complicated?

Here’s one weird trick to set up CloudFront to allow all origins and cache aggressively, even on simple requests. Explicitly set the Origin custom header in the CloudFront distribution.

Origin https://www.example.com

That’s it. Now every response will always have the CORS headers because when CloudFront forwards the request to S3 it includes the Origin header. Even on simple requests. You can probably put in any origin domain you want, but www.example.com is just what I used.
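In the console this lives under the origin’s custom headers; if you drive CloudFront through the API or CLI instead, the same setting is the CustomHeaders block on the origin in the DistributionConfig. A minimal sketch of that fragment (the bucket name and origin id are placeholders; the header value is the one from above):

```json
{
  "Id": "my-s3-origin",
  "DomainName": "my-bucket.s3.amazonaws.com",
  "S3OriginConfig": { "OriginAccessIdentity": "" },
  "CustomHeaders": {
    "Quantity": 1,
    "Items": [
      { "HeaderName": "Origin", "HeaderValue": "https://www.example.com" }
    ]
  }
}
```

With this in place CloudFront sends the same Origin header on every request to S3, so S3 always answers with its CORS headers and the cached response is identical for every viewer.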

Daniel X Moore Talks about HyperDev on The New Stack @ Scale Podcast

Daniel X just spoke on a podcast about HyperDev, organizational structure, and agility. Check it out!

The New Stack @ Scale Podcast

“It’s easy to think that developer tools are there to make life easier for developers. It’s actually a lot broader than that. What we found when developing HyperDev is, as the barrier gets lower and lower, more people in the organization, people you might not traditionally think of as developers, are able to contribute, are able to build applications, are able to solve their own problems.” – Daniel X Moore

GitHub Pages Custom Domain with SSL/TLS

The Overview

Route53 -> CloudFront -> github.io

You’ll get the joys of having SSL/TLS on a custom domain https://danielx.net backed by the ease of deployment and reliability of GitHub Pages.

The Price

  • Route 53 ($0.50)
  • CloudFront (pennies!)
  • SSL/TLS Cert (free!)

The Details

Get the certificate for your domain at https://aws.amazon.com/certificate-manager/. Be sure your contact details on the domain are up to date because Amazon uses whois info to find out where to send the confirmation email. I like to request a certificate for the wildcard as well as the base domain, i.e. *.danielx.net and danielx.net, that way I can use the same certificate if I want to have other CloudFront distributions for subdomains.


You’ll need to click through the links Amazon emails you so that they can validate your ownership of the domain and activate the certificate.

Next, create your CloudFront distribution and choose “Web”. Configure your origin, in my case strd6.github.io. Choose “HTTPS Only” for the origin protocol policy so that CloudFront will only connect to your GitHub Pages over HTTPS.
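For reference, here’s roughly what that origin looks like as a DistributionConfig fragment if you ever script this with the CloudFront API or CLI (field names from the API; the origin id is a placeholder, and this is a sketch rather than a complete config):

```json
{
  "Id": "github-pages",
  "DomainName": "strd6.github.io",
  "CustomOriginConfig": {
    "HTTPPort": 80,
    "HTTPSPort": 443,
    "OriginProtocolPolicy": "https-only"
  }
}
```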


Configure the caching behavior. Here I add OPTIONS to the allowed methods; I’m not sure this is strictly necessary, since GitHub Pages enables CORS by adding the Access-Control-Allow-Origin: * header to all responses. You may also want to customize the behavior and set the default TTL to zero. GitHub sets a 10-minute caching header on all resources it finds, but won’t set one on 404s, and a zero default TTL prevents CloudFront from caching a 404 response for 24 hours (yikes!).
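In API terms the two settings above land in the DefaultCacheBehavior: OPTIONS joins GET and HEAD in AllowedMethods, and DefaultTTL drops to zero. A sketch (TargetOriginId is whatever id you gave your origin; the real config needs a few more required fields):

```json
{
  "DefaultCacheBehavior": {
    "TargetOriginId": "github-pages",
    "ViewerProtocolPolicy": "redirect-to-https",
    "AllowedMethods": {
      "Quantity": 3,
      "Items": ["GET", "HEAD", "OPTIONS"]
    },
    "MinTTL": 0,
    "DefaultTTL": 0
  }
}
```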


Here’s where we add our certificate. Be sure to fill in the Alternate Domain Names (CNAMEs) field with your domain, and be sure your certificate matches!

You’ll also want to set the Default Root Object to index.html.


You can also add logging if you’re feeling into it.

If your domain is hosted somewhere else you can transfer your DNS to Route 53; otherwise you can set up the DNS records with your domain provider.

Create a Route 53 record set for your domain, then create an A record. Choose Alias, and select the CloudFront distribution as your alias target. Note: you may need to wait ~10-15 minutes for the distribution to juice up.
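If you script this step, it’s a ChangeResourceRecordSets call with an alias A record. The dXXXX.cloudfront.net name below is a placeholder for your distribution’s domain, and Z2FDTNDATAQYW2 is the fixed hosted zone ID Route 53 uses for CloudFront alias targets (worth double-checking against the current AWS docs):

```json
{
  "Changes": [
    {
      "Action": "UPSERT",
      "ResourceRecordSet": {
        "Name": "danielx.net.",
        "Type": "A",
        "AliasTarget": {
          "HostedZoneId": "Z2FDTNDATAQYW2",
          "DNSName": "d1234abcdefgh.cloudfront.net.",
          "EvaluateTargetHealth": false
        }
      }
    }
  ]
}
```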


Caveats

You need to be careful with your URLs (you’re careful with them anyway, right?!). You must include the trailing slash, as in https://danielx.net/editor/, because if you leave it off and request https://danielx.net/editor, GitHub will respond with a 301 redirect to your .github.io domain, and it won’t even keep the https!

If you hit a 404, CloudFront may cache the response for up to 24 hours with its default settings. This is because GitHub doesn’t set any caching headers on 404 responses, so CloudFront does its default thing.

Using Multiple CloudFront Domains with Paperclip

Using a CDN to speed up asset loading is generally regarded as a good idea. It is also recommended to split requests among separate hostnames so the browser can parallelize loading.

Enabling this in Rails with Paperclip is pretty easy, though the documentation isn’t extremely rich.

You’ll want to set the s3_host_alias option to a proc which determines the correct domain alias based on the id of the object the attachment is for.

  has_attached_file :image, S3_OPTS.merge(
    # Shard across four hostnames based on the record's id so the same
    # attachment always maps to the same hostname.
    :s3_host_alias => Proc.new { |attachment| "images#{attachment.instance.id % 4}.pixieengine.com" },
    :styles => {
      ...
    }
  )

This sends requests to the following hostnames:

images0.pixieengine.com
images1.pixieengine.com
images2.pixieengine.com
images3.pixieengine.com

The best part is that the same image will always have the same hostname. I’ve seen some people suggest randomly choosing a domain, but that reduces caching potential as the same item could be requested from multiple different domains over time.
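Since the proc keys the hostname off the record’s id, the mapping is a pure function and easy to sanity-check outside of Paperclip. A quick sketch (asset_host is just an illustrative name, not a Paperclip API):

```ruby
# Shard asset hostnames by record id, mirroring the Paperclip proc above:
# the same id always maps to the same hostname.
def asset_host(id, shards = 4)
  "images#{id % shards}.pixieengine.com"
end

asset_host(42)  # => "images2.pixieengine.com" -- stable across requests
asset_host(46)  # => "images2.pixieengine.com" -- 46 % 4 == 2 as well
# Consecutive ids spread evenly across images0 through images3:
(0..3).map { |id| asset_host(id) }
```

A random choice would spread load just as well, but would let the browser fetch the same image from different hostnames over time, defeating its cache.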

The Advantage of Code Based Game Development Environments

Game development environments that leverage graphical interfaces and parameterized editors are interesting. They have the ability to lower the bar required to get a game up and running without making serious mistakes or getting lost in dead ends. Therefore they are a valuable tool in broadening the population of game developers.

However, there always remains the need to drop into the source code and edit the algorithms directly. Data structures and algorithms are what software is made of, and if our only interface into game creation is a parameterized editor where we can only configure values, then it will prevent breakthroughs just as much as it prevents failures and dead ends. This is why, no matter how many wizards, GUI tools, application builders, etc. we have, we must always be able to go to the source and edit.

True progress is born from changing the paradigm, not changing the parameters.

Start Writing Your Blog

Do you have a website? Do you have a blog? I’d totally love to read it.

The thing is, though, that it takes a while for it to get good. I started with STRd6 almost two years ago, and it’s still not good. On the plus side, it adds incentive to keep cranking out the content. Each new article adds a new layer of better content to cover up the old.

So what are you waiting for? If you want a really high class blog then you’ll definitely need to start today. I bet you even have some ideas for blog posts saved up. You could easily write four in the first month.

So go ahead and do it already. You don’t need to worry about being better than the blogs that are already out there, just as good as they were when they started. I’m not even going to link back to my early posts… too embarrassing. And only 20 posts in 2008? You can totally beat that.

Add your new blog in the comments. I’ll check it out in a few years when it’s good.

Onion Eyes

The onion reminds me of my frailty. This simple vegetable brings me to tears and there is nothing I can do about it. It’s easy to forget how fragile I am. The best technology can do is hide my vulnerability; the worst is to allow me to forget its truth.

Billions of years of serendipity have allowed me the grace of existence in this world. Meanwhile it is all I can do to rise slightly above the chaos, knowing the whim of chance can take it all away. I come from a long evolutionary line of stupendous bad-asses. So do onions. All the denizens of this world do. Except for the most powerful; they come from nothing, and are all around us.

The onion reminds me to strive for clarity. To persist through the fog. To try harder. I now have a delicious dinner.

i see _why, he wanted one in his book

The Chasm of Compromise

You and your friend are trying to decide whether or not to jump across a chasm. You want to jump across to the other side, where you will be rewarded with delicious berries. Your friend wants to stay put and eat the so-so berries here. So you decide to compromise: you each jump halfway across the chasm and die.

Not all situations are chasms, but the really important ones usually are. Compromise can kill you; go for consensus if you can. If you can’t, go alone.