Chances are you are in need of landing page hosting.

If you need something simple and visual to get started, check out Landerlab. If you are looking for something free or more technical (i.e. getting into the code), read on.

Unless you're using something like Leadpages, Wix, WordPress etc., you'll likely be using static landers -- that is, landers composed of HTML, CSS and JS files.

These types of files are all "client-side" and can be cached by CDNs and delivered quickly, so they are great for affiliate marketing landing pages used in campaigns across the globe.

...but, they can be more annoying to manage than a simple visual editor on some platform.

If you're an affiliate marketer and you want a streamlined setup that:

  • Is free

  • Is fast and has high performance delivery of pages

  • Is quite automated and flexible

  • And lets you automate part of your tracking setup...

...then continue reading and I'll show you how to get it set up from A-to-Z with GitHub, Netlify, Cloudflare, and automated tracking with FunnelFlux Pro.

This setup makes it simple to manage hundreds of landers, can be replicated across unlimited domains, frees you from waiting on slow FTP, and gives you automated minification, image recompression and ridiculously fast page delivery.

This guide is designed for newbies and goes through things step-by-step, hence its length (if you're already quite technically skilled, just skim).


Getting Started

Before we begin, you'll need several accounts:

  1. A GitHub account

  2. A Netlify account

  3. A Cloudflare account and your lander domain added to it (chances are you'll be managing your tracking domain in the same place)

  4. A text editor like SublimeText, Notepad++, Atom, etc. My personal preference is Atom and I think its default settings are better for a newbie to Git. So if you aren't particularly used to editing HTML, CSS etc., go get Atom too.

If you don't have any of these, go and sign up for an account now. They are all free.


Get a basic Git repository set up

Git is a version control system that developers use to collaborate on software development and commit code, and GitHub is the most popular service for hosting Git repositories. It's also very useful for basic website creation, as it lets you keep track of changes.

Firstly, go set up a GitHub account.

Once logged in, on your dashboard you should be able to create a new repository:

Name the repository something sensible like "my-affiliate-landers". Choose private. I would tick the options to add a README, a .gitignore and a license, just to populate some "stuff" in the repository. It doesn't matter what you pick for the ignore/license.

You're going to need a different repository for different domains you use (with separate content), so the domain itself might be a good name too -- and I would recommend this if you decide to use this Netlify approach long term.

Now, click create repository.

Now that your repository is created, we'll need a way to access it remotely. For this we want a desktop application. If you already have one you like, use that, otherwise go download GitHub Desktop - https://desktop.github.com/

Once that is installed, we want to clone the repository into the app so we can edit everything on our PC directly.

Click the repository in GitHub (the site) to get to the base page of the repository. Then click the Code button > Open in GitHub desktop:

If you are on the page right after creating your repository, you can also do it directly from there:

This should trigger an alert to open in the GitHub desktop application. Click yes and it should open the app and a prompt like this:

Here, all you need to do is set the local path to some sensible directory. I would recommend making a clear folder in your working directory somewhere that's easy to access and not somewhere deep... e.g. C:\GitRepos\ is better than a folder buried in a dozen other folders (for various reasons).

So not like I have above, where it's inside some user account's documents folder.

Because this is going into a versioning system, don't worry about putting this inside something like Dropbox. There's no sense in double-syncing your content, and your Dropbox syncing could cause conflicts with GitHub, so avoid it.

Click CLONE and go. It should show a progress bar where it is cloning the repository, downloading objects, etc.

Once pulled, you'll notice in the top toolbar you have the current repository (use this pane for switching repositories), current branch, and an action button:

Here, just stick to the main branch for now. We don't need to worry about a develop branch, pull requests and testing. For now, let's keep it simple and have everything go live immediately -- you can make it nicer later.

And that's it! You now have a local Git "repo" set up where you can store all your landing page content and make changes.


Get your Cloudflare account ready

Not much to do here, but we need your domain and Cloudflare account to be ready.

Set one up, add your lander domain to it. If you're using track.domain.com with FunnelFlux and have your domains on domain.com, chances are you're already ready to go.

If not, you can check our domain setup guide and go through the basics of Cloudflare. You'll need to add the domain, then change the "nameservers" with your registrar (GoDaddy, Namecheap etc.) to point to Cloudflare.


Initial Netlify setup

Netlify is awesome -- what they provide for free is extremely cool.

Think of Netlify as a system that will automatically deploy your GitHub repo content to live hosting every time you commit changes. It's much easier and more automated than FTP and comes with a number of other advantages.

Head over to Netlify and sign up for a free account.

It will ask you various things as you sign up (e.g. creating a team) -- just breeze through it and get inside the dashboard. It should look something like this:

Feel free to browse around -- the Plugins section has a number of helpful plugins we can use later, the Domains section lets you move domains over to Netlify for automatic management, and so on. I would not recommend moving your domains over unless they are used only with Netlify, not with tracking or anything else.

So, let's get going -- click New site from Git.

Click GitHub as your continuous deployment provider:

A window will open asking you to authorise access to your GitHub account. Give it the permissions it needs. Once done, you can click a repository to add:

On clicking this it will ask you for final site settings -- just pick main branch and go, there's nothing further to change.

Once added it should redirect you to the site dashboard:

Once it fully deploys, you should have an app URL underneath the project name, e.g.

https://elastic-nightingale-2df237.netlify.app/

Click this and load it!

Did it work? Probably not, because we have nothing in our repository, right?

So, let's put something in the repository first and commit our first change.

Go to the folder you set for your repository on your computer. Mine looks like this:

Create a new index.html file and open it in a text editor. Paste in the following code and save:

<!DOCTYPE html>
<html>
  <head>
    <title>My Test Page</title>
  </head>
  <body>
    <p>Nice, my first site via Netlify and GitHub!</p>
  </body>
</html>

Now, when you save, the GitHub application is going to detect the file changes. Head back to GitHub desktop. You should see something like this:

When you make changes to your repository, you should see a list of changed files in the changes tab. If you click the files, it will show a change log with green for additions and red for deletions.

Now, we want to commit this change to the main branch (which we are already on).

To do that, type a title for the commit like "Initial index file" then click commit to main. Once you do this, your changes are "committed" locally, and you need to push them to the GitHub repository to formally save them.

Think of commits like draft saves, where you're saving various modifications and splitting them into chunks. Then when you're ready, you formally push them to the repository, using one of these buttons:

Click it and go! Now, quickly switch to your Netlify dashboard.

It's integrated with GitHub, so it listens for all commits. It will build very quickly on this occasion, so you might have already missed it. You should see your commit in the build list:

Now, load your app's URL again.

Hopefully... you should see your index file load with your "Nice, my first site via Netlify and GitHub!" content.

Well done, you've just created a full repository > live site integration with Netlify :-)


Setting up a custom domain with Netlify

You're going to want to use a custom domain to point to your Netlify app as well as use Cloudflare for caching of content.

To do this, first get your domain added to Cloudflare. Now on your Netlify dashboard, go to domain settings:

Click add custom domain, then put the domain you want to use. You can put domain.com or sub.domain.com. Click verify.

When it asks if you are the owner, click yes.

It should redirect back to the custom domains section, now with your domain and a www subdomain added, and a check DNS configuration link.

Click this link to load the details we need:

We will use the alias approach. Head over to Cloudflare and add a new CNAME record that points your domain to the Netlify app address they provide, like so:

For now, keep this in DNS only mode, i.e. gray cloud not orange cloud. We want to use orange cloud mode to use Cloudflare's CDN capabilities, but if we turn it on now it will interfere with Netlify verifying the domain.

Netlify will try to verify the domain automatically when you refresh the page, so there is nothing to do but wait until that "Check DNS configuration" alert goes away after a few minutes.

Let's also add a "www" record to satisfy them too. You can use the following approach (of course, use your domain name):

Once the check DNS alerts are gone, let's turn on Cloudflare's proxying to get full CDN capabilities.

For both the records you added, switch them to orange cloud mode:

Here you can see I have a "trk" subdomain record that points to FunnelFlux.

The easiest way to make this Cloudflare account work well for landing pages + tracking is to set account-level settings for the landing pages, then a single page rule for the trk subdomain that ensures it works well with FunnelFlux.

Head over to the page rules section. If you have a tracker record present, make sure you have a page rule following our guide that basically says don't cache anything, use flexible SSL, turn all security off, all performance off, etc.:

Now, let's go through the account settings and make sure we are maximising the performance enhancements for our landers:

  • SSL > Overview > set to Flexible as the default

  • SSL > Edge Certificates > turn on Always Use HTTPS, Opportunistic Encryption, TLS 1.3 and Automatic HTTPS Rewrites. This way we force HTTPS as much as possible, which is good for tracking/reliability too

  • Speed > Optimization > turn on Auto-minify of everything, Brotli, and Rocket Loader

  • Caching > Configuration > set the default caching level to Standard

  • Network > make sure all toggles here are on (except Pseudo IPv4, which can stay off)

Ok, that's all those settings sorted -- you probably didn't have to change much, but I wanted to be thorough.

Our goal was to use a custom domain and SSL -- so test it!

Load your custom domain, e.g. http://secretlife.news/

Hopefully you'll see your HTML page and message, and notice that the URL redirects to HTTPS and shows as secure, since we set up Automatic HTTPS Rewrites for the domain:


Automating your FunnelFlux Pro Tracking

There are two great ways to automate your lander/offer JavaScript tracking with FunnelFlux Pro, both leveraging the fact that the page ID is the main dynamic part that needs to change for each page:

  1. Use Google Tag Manager and a Variable (lookup table) that maps URLs to page IDs, so that GTM injects the right page ID based on the URL loaded

  2. Embed the page ID in each page manually, then automatically inject generic FunnelFlux Pro JavaScript into every page

Here we are going to use option #2 to streamline things.

Firstly, head to FunnelFlux Pro and go to the landers section. Click column settings then turn on resource ID:

This column will show the page ID for each lander. You can double click it to copy:

Now, my approach here is to embed this ID on every one of your landers manually, to declare the page ID, then let automations handle the rest.

We do this by adding this to the top of your pages:

<script>var ff_page_id = 'THE_RESOURCE_ID_HERE';</script>

So when you add a lander (or offer you control) in FunnelFlux Pro, get its resource ID and add it to your page code in the top of <head>, like so:

<!DOCTYPE html>
<html>
  <head>
    <script>var ff_page_id = '0Uhinf8baM2X';</script>
    <title>My Test Page</title>
  </head>
  <body>
    <p>Nice, my first site via Netlify and GitHub!</p>
  </body>
</html>

Now when the page loads, JS executing on the page can use ff_page_id to know the page ID.
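For instance, any script that runs later on the page can read the global -- a minimal sketch (the helper name is mine, not FunnelFlux's):

```javascript
// Minimal sketch: read the page ID declared at the top of the page.
// The typeof guard avoids a ReferenceError on pages where the
// ff_page_id global was never declared.
function getDeclaredPageId() {
  return typeof ff_page_id !== 'undefined' ? ff_page_id : null;
}
```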

To demonstrate this, I'll save the page > commit the changes, push to the main branch and let Netlify build it. Once I see it deployed, I'll load my page again and hard refresh it (as it's being cached by Cloudflare, I need to make sure it loads the fresh content -- we will add automatic purging later!).

I then hit F12 to open the console and type "ff" in it. I can see the ff_page_id variable appear (via autocomplete) and that it has the expected value:

This tells me that my JS is working as expected.

Now, let's deal with FunnelFlux Pro tracking. If I go to edit lander I can get the header include codes, which look like this:

Global header include:

<script data-cfasync="false" src="https://trk.secretlife.news/integration/lumetric.js?2.2.0"></script>
<script>
  var lum = new Lumetric();
</script>

View tracking:

<script>
  lum.event('view', {
    'query': {
      // 'f': '{FUNNEL-ID}',
      // 'n': '{NODE-ID}',
      'p': '0Uhinf8baM2X',
      // 'ts': '{TRAFFICSOURCE-ID}',
      // 'c': '{COST}'
    },
    'options': {
      'cookieAllowed': true,
      'urlRewrite': true,
      'timeOnPage': false,
      'resolveTokens': []
    },
    'onDone': function (response) {}
  });
</script>

Here, all I am going to do is remove the commented parameters (f, n, ts, c) since they are not being used anyway, and replace page ID with the JS variable that is declared early in the page:

<script>
  lum.event('view', {
    'query': {
      'p': ff_page_id
    },
    'options': {
      'cookieAllowed': true,
      'urlRewrite': true,
      'timeOnPage': false,
      'resolveTokens': []
    },
    'onDone': function (response) {}
  });
</script>

Hopefully you can see how this works without needing to know JavaScript.

Instead of declaring the page ID in the JS, we manually declare it in the page code, then have generic JS that could be included automatically on any page for tracking.

And that's exactly what we will do.

Head over to Netlify, go to site settings > build & deploy > post processing:

Time for some magic!

Click on snippet injection and add the global header include, which should come before </head>:

Save, then add a new snippet before </body> with our generic view tracking code:

Now, let's test it.

I will first add this as a new page to my landers in FunnelFlux Pro:

On save, I will get the newly created page ID from the table:

I'll go change this in my index.html file:

And of course, commit and push the changes via my GitHub Desktop client:

This will trigger the build in Netlify, which now has post processing config that injects our FunnelFlux Pro tracking code.

If we reload the site, we can check this in two ways:

Check the page source code

If I right click > view page source, I can see all my code is present:

Check the network console

Here, I can go to the network tab in web developer tools and refresh. If my automation is working correctly, we should see the FunnelFlux Pro JS loading, which triggers two "funnel" requests:

Nice! We are almost there!

But, the tracking didn't actually work... You can tell by clicking the funnel request row (not the preflight one) and then the response tab:

Getting a bit technical hehe -- this shows the JS loaded and executed fine, but the system basically said ¯\_(ツ)_/¯

Why?

FunnelFlux knows the page ID, but has no idea what funnel this page is meant to be tracked in. The system can't assume, as your page could be used in a dozen different funnels.

The solution is just to generate a direct tracking link for this page via the funnel builder, which will include funnel, traffic source and other parameters in the URL. Or a redirect URL will work fine too. Either one provides context that the JS then uses.

So you might then ask...

What's the point of embedding page ID if we need to pass data in the URL anyway?

Very good question, and if you asked it then you probably have a good handle on what's going on here.

We don't actually need to dynamically embed any ID if our incoming URLs always have the data we need, which will be the case if you generated links properly within the funnel builder that you then use in ad campaigns.

But I was preparing you for an important situation...

What happens if someone clicks to the next page then uses the back button, and tracks a view on the previous page? What if they click from page to page via your direct links and go back to a page without any URL params now?

Users do weird things, and I like to make tracking robust. Embedding page ID helps reduce the risk of issues here.

And what if someone visits the page directly without any URL params at all? Such as in the case of organic traffic, referrals, or a live static website?

In this case, you can add a default funnel ID to your page and the JS as well.

This way any visitors will automatically get tracked correctly as the tracker knows the page ID AND the funnel ID to use.

You'd do that like so:

<!DOCTYPE html>
<html>
  <head>
    <script>
      var ff_funnel_id = 'FUNNEL_ID_HERE';
      var ff_page_id = '0wzUG9Wgw7tI';
    </script>
    <title>My Test Page</title>
  </head>
  <body>
    <p>Nice, my first site via Netlify and GitHub!</p>
  </body>
</html>

Then in your JS snippet in Netlify build settings:

<script>
  lum.event('view', {
    'query': {
      'f': ff_funnel_id,
      'p': ff_page_id
    },
    'options': {
      'cookieAllowed': true,
      'urlRewrite': true,
      'timeOnPage': false,
      'resolveTokens': []
    },
    'onDone': function (response) {}
  });
</script>

Here, funnel ID would be the default funnel you want visits to go to. You can get this from the table as with landers, and can see the funnel ID in the URL of the funnel editor.

And that's it... fully automated JS tracking in all your landers.

If you're running straightforward affiliate campaigns, you don't really have to do much -- just generate links, use automatic JS embedding via Netlify and go.

If you are expecting significant organic traffic, embed the page ID as well as a default funnel ID.

Just be aware, if you are injecting funnel ID into our view JS as well, you must always declare a funnel ID -- otherwise our JS will break, as it tries to use a variable that doesn't exist.
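One way to sidestep that breakage -- a hedged sketch, not the official FunnelFlux snippet -- is to build the query object with typeof guards, so any variable you didn't declare is simply skipped:

```javascript
// Hedged sketch (buildTrackingQuery is a helper name I made up):
// assemble the 'query' object defensively, so the view call works
// whether or not a page declares ff_funnel_id. The typeof guards
// avoid ReferenceErrors for globals that were never declared.
function buildTrackingQuery() {
  var query = {};
  if (typeof ff_page_id !== 'undefined') {
    query.p = ff_page_id;
  }
  if (typeof ff_funnel_id !== 'undefined') {
    query.f = ff_funnel_id;
  }
  return query;
}
```

You would then pass buildTrackingQuery() as the 'query' value in lum.event('view', ...) instead of a literal object.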




Supercharging landing page delivery performance

If you're running push/pops/native and doing high volume, targeting mobile, or targeting countries with poor connectivity, then it's critical that your landers load quickly.

Here I will show you how to extend this setup to be STUPID FAST.

We'll do this by making Cloudflare cache everything, including the HTML pages, AND invalidate caching every time we make updates.

We want to make cold loads as fast as possible.

There's a few things we need to do here:

  1. Set Netlify to optimise images

  2. Update Netlify settings to not override caching info (i.e. we can't let Netlify's defaults override Cloudflare behaviour)

  3. Set a page rule in Cloudflare to cache everything

  4. Automate Cloudflare cache purging by Netlify

Let's get started, and we can do a before and after test.

Firstly, a test lander

This is some basic product page I got from some network. It has about 85 files in it and is 2.7 MB on disk, and has some poorly optimized images.

I'll run some load tests. I will do this from a non-US location as I don't want to see the results of some US server pulling content from another US server.

A US-to-US test wouldn't give us a good real-world assessment of our performance improvements from a CDN -- I want to see the experience of a client from Asia loading content from a US server, for example.

So, I will use GTMetrix and set it to use Hong Kong, Chrome, and a connection throttle of Broadband Fast (20/5 Mbps, 25ms).

Before - GTMetrix

  • 2.3 MB in total content

  • About 4.2 second full load time

  • Notice it took about 700 ms to get the page itself before anything started loading

  • The images took some time to load, since they weren't yet cached at the edge and we throttled the connection. So I show a repeat test below too.

Test 1:

Test 2:

Well, it got worse haha, so it's clear there are big improvements to make.


Time to supercharge things


Step 1 - Optimise Images with Netlify

Head to site settings > build & deploy > post processing. Under asset optimisation, turn on everything except pretty URLs:


Step 2 - Override Netlify caching headers

This one I figured out through testing -- Netlify sets caching headers on assets so that when you deploy rapidly, caching doesn't get in the way of seeing your live changes.

But we want maximum performance, not maximum responsiveness, as we can always purge the Cloudflare cache ourselves.

To do this we need to add a "toml" config file to the root directory of your repository.

Go there now and create a file netlify.toml and add this content:

[[headers]]
  for = "/*"
  [headers.values]
    cache-control = '''
    max-age=604800,
    s-maxage=604800,
    public,
    immutable'''

So it should be here:

This config is quite aggressive and really relies on the Cloudflare purging to work properly -- otherwise your changes will never appear!

Push this update to your Git repository. This file is read by Netlify during builds and overrides default behaviour. Above, we are specifying new cache-control headers for all files (that's what the /* pattern matches) to make sure they cache properly.
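Once this deploys, you can confirm the header is actually being served, e.g. from your browser's Network tab or with a small script. A sketch -- parseMaxAge is a helper name I made up for illustration:

```javascript
// Sketch: pull max-age out of a Cache-Control header value, so you can
// confirm the netlify.toml rule took effect.
function parseMaxAge(cacheControl) {
  // note: the s-maxage directive has no hyphen before "age", so this
  // simple pattern only matches max-age
  var match = /max-age=(\d+)/.exec(cacheControl || '');
  return match ? parseInt(match[1], 10) : null;
}

// Usage (browser console on your live site, or Node 18+):
// fetch('https://yourdomain.com/', { method: 'HEAD' })
//   .then(function (r) { console.log(parseMaxAge(r.headers.get('cache-control'))); });
```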


Step 3 - Make Cloudflare cache everything

Now, head to Cloudflare and the Rules section. We want to create a rule that says "Cache everything!"

It's quite simple and looks like this:

When I save this, I now need to make sure that its priority is BELOW my tracker rule, so that I don't mess with it -- i.e. the tracker rule should be #1.


Step 4 - Make Netlify auto-purge Cloudflare

This caching is great, but we still want to see our live updates when testing, right?

So to do this, we make use of one of the Netlify plugins.

Click on the plugins menu item and go to the directory. Search for Cloudflare. You should see the "Cloudflare cache purge" plugin. Install it.

Once installed, we need to do some tech setup, which is described in their help docs here - https://github.com/chrism2671/netlify-purge-cloudflare-on-deploy#readme

Get your Zone ID

Go to the Overview tab of your site in Cloudflare, scroll down to the API box:

Open a text editor and copy/paste the Zone ID there.

Get an API token

Click the "Get your API token" link as in the image above.

Click on the API Tokens tab, then create token, then "Get Started" under Custom Token.

Name it "Netlify Purge Cache Token" and set the permissions to allow cache purging, like below:

Click Continue to Summary > Create Token

On this page, copy the token to your text editor. Note you will not be able to see this token again, so make sure you don't close this page until you have it safely, or you will need to recreate it.

Add Netlify Environment Variables

Now that we have the Zone ID and API token, go to site settings > build & deploy > environment > environment variables in Netlify and set up:

  • CLOUDFLARE_ZONE_ID

  • CLOUDFLARE_API_TOKEN

Like so:

Save this!

We are now good to go and can test things.
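Optionally, before relying on the plugin, you can sanity-check the zone ID and token by calling Cloudflare's v4 cache-purge endpoint directly. A sketch (the buildPurgeRequest helper name is mine):

```javascript
// Hedged sketch: build the request for Cloudflare's v4 cache-purge API,
// using the same zone ID and API token you gave Netlify.
function buildPurgeRequest(zoneId, apiToken) {
  return {
    url: 'https://api.cloudflare.com/client/v4/zones/' + zoneId + '/purge_cache',
    options: {
      method: 'POST',
      headers: {
        'Authorization': 'Bearer ' + apiToken,
        'Content-Type': 'application/json'
      },
      // purge_everything clears the whole zone, same as the plugin does
      body: JSON.stringify({ purge_everything: true })
    }
  };
}

// Usage (Node 18+ or a browser console, with real values):
// var req = buildPurgeRequest('YOUR_ZONE_ID', 'YOUR_API_TOKEN');
// fetch(req.url, req.options).then(function (r) { return r.json(); }).then(console.log);
```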

So, now I would make any simple change to the repository and commit it, just to trigger a rebuild. A single letter change is fine.

The site will start building and we can click the build log to see what's happening under the hood:

If you go through the build log, you should see it installing the Cloudflare cache plugin, succeeding in purging, and doing a bunch of post processing including on images (which we definitely want).


Testing the results

Let's compare GTMetrix tests now...

Test 1:

Much better than before - remember the previous load times were ~700 ms for the page then 4-7 seconds for the total content.

When I check individual assets now I can see the headers are better. I just rebuilt the site, so I expect the cache to be purged and the second attempt to now resemble real-world performance.

Test 2:

Now we are down to 1.7 seconds load time and the images load much faster, and it seems they are limited more by the client loading them one by one rather than the connection itself.

So here's the Before vs After:

  • Initial HTML page load - 700 ms > 250 ms

  • Overall content size 2.3 MB > 2.2 MB (not much image recompression this time)

  • Overall content load 4.2 s > 1.7 s

And one final test...

I made a change to the page content and pushed it to Git.

Now I opened the page URL in a new incognito window every few seconds just to check that the changes eventually appear, as they would for a real user.

The result?

  • Changes appeared almost immediately in incognito windows

  • Didn't appear in a normal window unless I loaded the page then hit F5, as the content was cached browser-side

  • Took about 20 seconds from pushing in Git to the build finishing in Netlify

So all around, a great result!

If you have any issues or questions let me know, and any feedback on improvements is welcome.
