PROJET AUTOBLOG


bfontaine.net

Original site: bfontaine.net


Use a Custom TLD for Local Development

Monday, August 26, 2013 at 22:31

In this post, we’ll create a custom TLD for local development and configure Apache to work with it. This lets you work on your local version of mysuperwebsite.com under the local domain mysuperwebsite.dev, with exactly the same URLs except that little .com, which is replaced by .dev.

This post focuses on OS X, but the tools used are also available on Ubuntu and other systems. You can also pick any nonexistent domain you want; just replace .dev with the one you chose in the commands described in this article.

DNSMasq

The main problem when developing locally with custom domains is that we have to add an entry to /etc/hosts for each website. Wildcards are not supported, so you can’t write the following line in it:

127.0.0.1 *.dev

There are different workarounds, but here we’ll use DNSMasq as a local DNS resolver. If you don’t have Homebrew, install it first, then install DNSMasq:

brew install dnsmasq
# enable the daemon on startup
sudo cp $(brew --prefix dnsmasq)/homebrew.mxcl.dnsmasq.plist /Library/LaunchDaemons/

DNSMasq will run locally and redirect any query for a *.dev domain to the local host, 127.0.0.1. Open /usr/local/etc/dnsmasq.conf (or create it if it doesn’t exist) and add the following lines:

address=/dev/127.0.0.1
listen-address=127.0.0.1

Then start DNSMasq:

sudo launchctl load /Library/LaunchDaemons/homebrew.mxcl.dnsmasq.plist

You’ll then need to tell OS X to send its DNS queries to this local server first, before the other ones, so that queries for *.dev domains are intercepted. Go to System Preferences → Network → Advanced → DNS, and add 127.0.0.1 at the top of the list of DNS servers.
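If you prefer the command line, networksetup can do the same thing. This is only a sketch: “Wi-Fi” is an assumed network service name and 8.8.8.8 stands in for whatever upstream resolvers you were already using; adapt both to your setup.

# list the network service names to find the one you use
networksetup -listallnetworkservices
# put the local resolver first, followed by your usual upstream DNS servers
sudo networksetup -setdnsservers "Wi-Fi" 127.0.0.1 8.8.8.8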

You can then check that it works using dig:

$ dig foobar.dev
…
;; QUESTION SECTION:
;foobar.dev.            IN  A

;; ANSWER SECTION:
foobar.dev.     0   IN  A   127.0.0.1

;; Query time: 0 msec
;; SERVER: 127.0.0.1#53(127.0.0.1)
…

If you have some server listening on port 80, you can try foobar.dev in your browser: it’ll display whatever you’re serving on 127.0.0.1:80. If you have any trouble, empty your DNS cache (use dscacheutil -flushcache on OS X), or restart your computer.
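If nothing is listening yet, a throwaway HTTP server is enough to test the resolution. A minimal sketch, assuming Python is installed (it ships with OS X) and using a hypothetical directory; it serves on port 8000, so visit foobar.dev:8000:

cd ~/sites/foobar               # any directory with something to serve
python -m SimpleHTTPServer 8000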

That’s all! You can stop there, but if you’re using Apache you may be interested in the next section of this article.

mod_vhost_alias

This Apache module allows you to manage virtual hosts dynamically, so you won’t have to create a new one for every (local) website. This is useful if you have a large number of virtual hosts with similar configurations. In this section, we’ll see how to associate a .dev domain with a directory on your computer. We’ll assume you already have Apache installed and working.

With mod_vhost_alias, Apache extracts the hostname from the client’s request (via the Host HTTP header) and uses it to build the directory path. The official doc has a lot of examples; I personally prefer to be able to use whatever directory names I want, so I’m using a vhosts directory which contains symbolic links to the right directories. As with DNSMasq before, you only need to add two lines of configuration here. Open /etc/apache2/extra/httpd-vhosts.conf (you’ll need to use sudo), and add the following lines at the end of it:

UseCanonicalName Off
VirtualDocumentRoot "/Users/baptiste/some/dirs/vhosts/%-2.0.%-1.0"

You may want to customize the path. This one takes the last two parts of the domain (i.e. the domain and the TLD) and uses the directory ~/some/dirs/vhosts/domain.tld/ for it. Note that it’ll use the same directory for bar.dev, foo.bar.dev, qux.foo.bar.dev, etc.
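To make the interpolation concrete, here is how a few Host headers would map to document roots under the configuration above (mysite.dev is just a hypothetical extra example):

bar.dev       ->  /Users/baptiste/some/dirs/vhosts/bar.dev
foo.bar.dev   ->  /Users/baptiste/some/dirs/vhosts/bar.dev
mysite.dev    ->  /Users/baptiste/some/dirs/vhosts/mysite.dev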

Then, use ln -s to make symbolic links from vhosts/ to the right directories, e.g.:

~/some/dirs/vhosts/foo.dev -> ~/sites/foo.dev
~/some/dirs/vhosts/bar.dev -> ~/perso/blog
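In commands, that would look like this (the source paths are the hypothetical ones above; note that ln -s takes the target first, then the link name):

mkdir -p ~/some/dirs/vhosts
ln -s ~/sites/foo.dev ~/some/dirs/vhosts/foo.dev
ln -s ~/perso/blog    ~/some/dirs/vhosts/bar.dev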

If all the websites are in the same directory, you can skip the “symbolic links” part. Restart Apache with sudo apachectl restart (on Ubuntu, use sudo service apache2 restart), and you’re done. In the future, you won’t have to restart Apache for each new site; you only need to restart it when you modify its configuration.

If you get some issues with rewrite rules, add

RewriteBase /

to the .htaccess of the websites that use them, and it’ll work.
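For context, a typical front-controller .htaccess would then look something like this. This is only a sketch: the actual rules and the index.php entry point depend on your application.

RewriteEngine On
RewriteBase /
# send everything that is not an existing file to the entry point
RewriteCond %{REQUEST_FILENAME} !-f
RewriteRule ^ index.php [L]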

Get RFCs in Your Terminal

Sunday, August 11, 2013 at 12:16

When working with Internet protocols, we have to read RFCs a lot. They can be found on the Web, but it’s more convenient to have them directly in the terminal. Ubuntu provides some packages to have them offline, but if you aren’t a sudoer, you can’t install them with apt-get. So I needed a little script to fetch RFCs from the IETF’s website and read them locally.

Here comes rfc

rfc was initially a small Bash script (~5 lines) that cURL-ed RFCs and displayed them with less. I used it for the Networking class at Paris Diderot. A few weeks ago, I enhanced it with a local cache (it now downloads an RFC the first time only) and an offline search feature. Thanks to ecksun, it can also be used to read drafts. The script works pretty much everywhere, and is really simple to use:

$ rfc <number>

For example, get RFC 6749 (OAuth 2.0) with:

$ rfc 6749

That’s all! Since it’s just plain text, you can pipe it or redirect its output to anything:

$ rfc 42 | lolcat        # rainbow RFC
$ rfc 4534 > rfc4534.txt # local copy
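For the curious, the original idea fits in a couple of lines. Here is a minimal sketch (not the actual script, which adds the cache, offline search, and draft support):

#!/bin/bash
# fetch the requested RFC from the IETF and page it
curl -s "http://www.ietf.org/rfc/rfc$1.txt" | less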

Install

Since it’s a standalone Bash script, you can put it wherever you want, as long as the directory is in your PATH. Here is a basic install:

mkdir -p ~/bin
curl https://raw.github.com/bfontaine/rfc/master/rfc > ~/bin/rfc
chmod u+x ~/bin/rfc

If you don’t have ~/bin in your PATH, add this line to your ~/.bashrc:

export PATH="$HOME/bin:$PATH"

The only requirements are a pager (less is the default, but it’ll use $PAGER if it’s set) and curl (it’ll use $CURL if it’s set, and fall back on wget if curl can’t be found).
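As a sketch of how that kind of fallback is usually written in Bash (illustrative only, not the actual script):

PAGER="${PAGER:-less}"                     # use $PAGER if set, otherwise less
if [ -n "$CURL" ]; then
    FETCH="$CURL"                          # honor an explicit $CURL override
elif command -v curl >/dev/null 2>&1; then
    FETCH="curl -s"
else
    FETCH="wget -qO-"                      # fall back on wget when curl is absent
fi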

For more info, check the project on GitHub.

Open Spotify Links with the Desktop Client

Tuesday, July 23, 2013 at 21:55

Spotify’s Web app is great, but you may prefer to use the desktop client. The problem is that links default to the Web app. Here is a quick tip to get the behavior you want.

First, install Switcheroo. It’s a Chrome extension (Firefox users, go here) that allows you to set up custom redirect rules for any HTTP request, using a string replacement. Then, add a rule to replace http://open.spotify.com/ with spotify://.
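For example, with that rule a link like the following (the track ID is a placeholder) is rewritten before the request is sent, and the spotify:// scheme is picked up by the desktop client:

http://open.spotify.com/track/<track-id>  ->  spotify://track/<track-id>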

That’s it! Now, all open.spotify.com/something links will open in the desktop client instead of the Web one.

Note: I found the original tip here, but it was not working because the author suggested replacing the prefix with spotify instead of spotify://.