
Static Cache URLs Point to PHP site

Tuesday 29 April 2008 8:51:19 pm - 3 replies

Modified on Tuesday 29 April 2008 8:52:21 pm by Russell Michell


André R.

Wednesday 30 April 2008 1:58:50 am

>How do I get them to reference other .html files also in the 'static' directory?

You don't. This is handled by the Apache rewrite rules, which check whether a static HTML file exists for the requested URL; if it doesn't, the request falls through to the dynamic page.

Doc:
http://ez.no/download/ez_publish/changelogs/ez_publish_3_6/new_features/static_caching_of_content
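
For illustration, the check looks something like this in mod_rewrite terms. This is only a sketch of the idea, not the exact rules from the changelog, and the 'static' directory name and file layout are assumptions:

RewriteEngine On

# If a pre-generated static copy exists for this URL, serve it directly...
RewriteCond %{DOCUMENT_ROOT}/static/$1/index.html -f
RewriteRule ^/?(.*)$ /static/$1/index.html [L]

# ...otherwise fall through to the dynamic eZ Publish front controller.
RewriteRule .* /index.php [L]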

eZ Online Editor 5: http://projects.ez.no/ezoe || eZJSCore (Ajax): http://projects.ez.no/ezjscore || eZ Publish EE http://ez.no/eZPublish/eZ-Publish-Enterprise-Subscription
@: http://twitter.com/andrerom

Russell Michell

Wednesday 30 April 2008 2:40:45 pm

Ah, thanks. Now I understand.

This does seem a little odd to me, though. I appreciate the intent here (i.e. cache only those areas of the site that need it, while still allowing some dynamic interaction), but it removes the possibility of caching the *entire* site.

Plone CMS has had this functionality for four or five years, and the new kid on the block, here in NZ at least, is SilverStripe (www.silverstripe.com), which is having the same functionality built in as I write.

As you can see from my use-case above, I really need to be able to 'cache' ('cook') dynamic content into static HTML, CSS and images in order to deploy an entire site remotely.

I will try to modify the code to deal with the link issue and, of course, to grab the images. Otherwise I'll have to resort to a manual 'scrape' or try out another CMS.

The ability to 'cache' was one of the main things that brought me to eZ - but it doesn't seem to work in the same way one might expect it to.

Many thanks for your time.
Russ

Russell Michell, Wellington, New Zealand.
We're building! http://www.theruss.com/blog/
I'm on Twitter: http://twitter.com/therussdotcom

Believe nothing, consider everything.

Russell Michell

Thursday 01 May 2008 7:54:13 pm

OK, for those who encounter this thread in the future: a much, much easier way is a shell script around wget. Just make sure you really need the --cut-dirs and --exclude-directories options.

Obviously more info can be had on your unix-like system by typing:

#> man wget
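
To illustrate (the page path here is made up, but the host is the one used in the invocation further down): with -nH and --cut-dirs=2, a fetched page such as

http://site-todeploy.co.nz/ezpublish/index.php/eng/About

gets saved under the download directory as eng/About.html (the -E option adds the .html suffix) instead of ezpublish/index.php/eng/About.html.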

The script:

#!/bin/bash
# Shell script to deploy an eZ Publish website.
# Me, May 1, 2008

# The URL to fetch:
URL=$1
# The dir to pile everything into locally
DIR=$2

# An array of dirs to exclude:
EXC[0]=ezpublish/index.php/eng/Community
EXC[1]=ezpublish/index.php/eng/user

# Build a list of dirs to exclude:
for exclusion in "${EXC[@]}"; do
        LIST+="$exclusion,"
done

# Remove the trailing comma. (The "%" operator deletes the shortest
# match of the following pattern from the END of the value.)
LIST=${LIST%,}

# The command:
CMD="wget -mErp"
CMD="$CMD -nH"
CMD="$CMD --convert-links"
CMD="$CMD --cut-dirs=2"
CMD="$CMD -P$DIR"
CMD="$CMD --exclude-directories=$LIST"
CMD="$CMD $URL"

# Execute
$CMD
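
If you want to see what that "%" expansion does, try it at a prompt (the variable contents here are made up):

#> LIST="foo,bar,baz,"
#> echo ${LIST%,}
foo,bar,baz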

Save the file as something like ez-deploy.sh and chmod it so that your system user can execute it:

#> chmod u+x ez-deploy.sh

Invoked:

#> cd path/to/parent-of-ez-deploy.sh
#> ./ez-deploy.sh http://site-todeploy.co.nz/ezpublish/ local-dir-to-download-into

NJoi :-)

Russell Michell, Wellington, New Zealand.
We're building! http://www.theruss.com/blog/
I'm on Twitter: http://twitter.com/therussdotcom

Believe nothing, consider everything.
