shake - rule finished running but did not produce file:

I am trying to use Shake to convert some markdown files to HTML ("bake"). The markdown files are in a directory "dough" and the HTML files should go to "baked". The goal is to produce the index.html file, which links the other files.
This is my first use of Shake!
The conversion works, but at the end the first rule produces the error
`rule finished running but did not produce file:`
The cause is perhaps that the index.html file is already produced by the second rule. How can I tell the first rule not to expect a result (or force producing it again)?
Secondary question: how do I change the first rule to collect files with both the "md" and "markdown" extensions?
Thank you for the help! Suggestions for improvements are most welcome!
bakedD = "site/baked" -- toFilePath bakedPath
doughD = "site/dough"
shakeWrapped :: IO ()
shakeWrapped = shakeArgs shakeOptions {shakeFiles=bakedD
, shakeVerbosity=Loud
, shakeLint=Just LintBasic
} $
do
want ["index"<.>"html"]
"index"<.>"html" %> \out ->
do
mds <- getDirectoryFiles doughD ["//*.md"]
let htmlFiles = [bakedD </> md -<.> "html" | md <- mds]
need htmlFiles
liftIO $ bakeOneFileIO "baked/index.html"
(bakedD <> "//*.html") %> \out ->
do
let c = dropDirectory1 $ out -<.> "md"
liftIO $ bakeOneFileIO c

The error message says that you declared a rule to produce index.html, but it didn't produce that file. From a read of your build system, it appears it actually produces baked/index.html? If so, change the want section to read:
do
    want ["baked/index.html"]
    "baked/index.html" %> \out ->
Now you are saying at the end of the execution you want to produce a file baked/index.html, and that here is a rule that produces baked/index.html. (If it's really producing site/baked/index.html then adjust appropriately.)
Addressing your second question, mds <- getDirectoryFiles doughD ["//*.md","//*.markdown"] will detect both extensions.
As for style tips, using "index" <.> "html" is not really helping - "index.html" is identical but clearer to read. Other than that, it seems pretty idiomatic.
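Putting those pieces together, a minimal sketch of the first rule might look like this (assuming the bakedD, doughD and bakeOneFileIO definitions from the question, and that bakeOneFileIO writes the file it is handed):

want ["baked/index.html"]

"baked/index.html" %> \out ->
    do
        -- collect both markdown extensions in one call
        mds <- getDirectoryFiles doughD ["//*.md", "//*.markdown"]
        let htmlFiles = [bakedD </> md -<.> "html" | md <- mds]
        need htmlFiles
        -- produce exactly the file this rule declares
        liftIO $ bakeOneFileIO out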

The issue was that the first rule wants a file, but that file is already covered (and produced) by the second rule. A hint of this problematic case is that the \out variable is not used and that producing index.html is not actually required in this rule (it is handled by the second rule). One can take this as an indication that a phony rule would be appropriate, which simplifies the code:
bakedD = "site/baked" -- toFilePath bakedPath
doughD = "site/dough"
shakeWrapped :: IO ()
shakeWrapped = shakeArgs shakeOptions {shakeFiles=bakedD
, shakeVerbosity=Loud
, shakeLint=Just LintBasic
} $
do
want ["allMarkdownConversion"]
phony "allMarkdownConversion" $
do
mds <- getDirectoryFiles doughD ["//*.md"] -- markdown ext ??
let htmlFiles = [bakedD </> md -<.> "html" | md <- mds]
-- liftIO $ putIOwords ["shakeWrapped - htmlFile", showT htmlFiles]
need htmlFiles
(bakedD <> "//*.html") %> \out ->
do
let c = dropDirectory1 $ out -<.> "md"
liftIO $ bakeOneFileIO c
I think that shake is a very convenient method to add a cache to a static site generator; it rebuilds only what is required!
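For the "rebuilds only what is required" part to work fully, the per-file rule should also declare the markdown source as a dependency with need; otherwise Shake does not know to rebuild an HTML file when its source changes. A hedged sketch, assuming the directory layout from the question and keeping the argument that bakeOneFileIO already receives:

(bakedD <> "//*.html") %> \out ->
    do
        let c = dropDirectory1 $ out -<.> "md"      -- argument expected by bakeOneFileIO
        let src = doughD </> dropDirectory1 c       -- assumed location of the markdown source
        need [src]                                  -- track the source so edits trigger a rebuild
        liftIO $ bakeOneFileIO c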

Related

Why does 'do' (Red language native function) change the current execution path?

Introduction
Let's assume that we have a project with structure like that:
project/
├── a.red
│
└── modules/
    ├── b.red
    └── c.red
Source code:
; -------- a.red --------
Red [ File: %a.red ]
b: do %modules/b.red
print b
c: do %modules/c.red
print c
; -------- modules/b.red --------
Red [ File: %b.red ]
return "B module loaded"
; -------- modules/c.red --------
Red [ File: %c.red ]
return "C module loaded"
What I am trying to do is:
load files modules/b.red and modules/c.red into file a.red
print loaded and evaluated content
I am using "do" and "do-file" functions from Red, but unexpectedly they change the current execution path after loading the first file from the modules subdirectory.
>> do %a.red
B module loaded
*** Access Error: cannot open: %modules/c.red
*** Where: read
*** Stack: do-file context do-file
As you can see, the modules/b.red file was successfully loaded, but modules/c.red was not.
Attempts
Take a look at the interesting output from the "pwd" function, which displays the current path. After evaluating modules/b.red the path had changed, so I also changed do %modules/c.red to do %c.red to make it work.
; -------- a.red (modified) --------
Red [ File: %a.red ]
print pwd
b: do %modules/b.red
print b
print pwd
c: do %c.red
print c
Execution:
>> do %a.red
%/home/mateusz/Red/project/
B module loaded
%/home/mateusz/Red/project/modules/
C module loaded
I noticed the same behaviour with the equivalent "do-file" Red function.
I was using Red version: 0.6.3.
Questions
Does somebody know why the "do" function changes the current execution path?
Maybe it is some issue or a convention taken from Rebol?
Any alternative solutions (if they exist) will also be helpful.
The issue you are facing is that you are calling return at the end of your scripts in the modules/ directory. This circumvents the end-of-script processing, so the current working directory does not get reset at the end of the script.
return is only designed to be called from within a function.

How to construct rule names according to an external factor?

Until now I had the following code:
shakeArgsWith opts additionalFlags $ \flags targets -> return $ Just $ do
    ...
    -- Get target arch from command line flags
    let givenArch = listToMaybe [v | AFlag v <- flags]
    let arch = fromMaybe defArch givenArch
    ...
    -- Set the main target
    let outName = masterOutName projType toolchain projName
    let mainTgt = (case projType of
                       Binary _ -> "bin"
                       Archive  -> "lib")
                  </> prettyShowArch arch
                  </> show variant
                  </> outName
    -- Set the build directory for the current run
    let buildDir = bldDir </> show toolchain </> prettyShowArch arch </> show variant
    ...
    mainTgt %> \out -> do ...
    ...
    buildDir <//> "*.o" %> \out -> do ...
    ...
...
Meaning that the names of the rules were constructed according to a command line flag that I was parsing (they contained the arch variable).
So if I ran shake --arch=x64 I was building the main target in the bin/x64/Release directory and my intermediate build files in the tmp/x64/Release folder accordingly.
But now, instead of using the command line flag, I want the shared arch variable that is used to construct the rule names to be populated from the output of some command; for example, if I could define some top-level action it would be this:
Stdout sout <- quietly $ cmd (EchoStdout False) (EchoStderr False) "gcc -dumpmachine" :: Action (Stdout String)
let foundArch = show (gccTripletToArch sout)
and use the variable foundArch instead of arch when constructing the mainTgt and buildDir names. Obviously this cannot be done as written, since even the only top-level rule that can be created with the action function returns Rules (). What can I do instead?
I think you should be able to do:
shakeArgsWith opts additionalFlags $ \flags targets -> do
    Stdout sout <- cmd (EchoStdout False) (EchoStderr False) "gcc -dumpmachine"
    let arch = show (gccTripletToArch sout)
    return $ Just $ do
        let buildDir = bldDir </> show toolchain </> prettyShowArch arch </> show variant
        ...
        buildDir <//> "*.o" %> \out -> do ...
The target patterns in Shake do have to be statically known, as that ensures some important properties with respect to quick rebuilding (you can guarantee a change in one place has predictable effects). However, you can run commands to determine things like arch, compiler version etc. before you create the build script, and bake them in.
Another viewpoint is that you are dynamically generating a build system based on the arch. With Shake being a Haskell EDSL, that is no particular problem, and you were arguably already doing that with the command line flag.
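To make the "run the command first, bake the result in" idea concrete, here is a small self-contained sketch. It uses readProcess from System.Process rather than Shake's cmd, and a crude stand-in for the gccTripletToArch helper from the question; the target name and compile command are made up for illustration:

import Development.Shake
import Development.Shake.FilePath
import System.Process (readProcess)

main :: IO ()
main = do
    -- query the toolchain before any rules are constructed
    triplet <- readProcess "gcc" ["-dumpmachine"] ""
    let arch     = takeWhile (/= '-') triplet        -- e.g. "x86_64"; stand-in for gccTripletToArch
        buildDir = "tmp" </> arch </> "Release"
    shakeArgs shakeOptions $ do
        want [buildDir </> "main.o"]
        -- the pattern is fixed once the rules are created, with arch baked in
        buildDir <//> "*.o" %> \out -> do
            let src = takeFileName out -<.> "c"
            need [src]
            cmd "gcc -c" [src] "-o" [out]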

Filter a list of paths to only include files

If I have a list of FilePaths, how can I filter them to return only the ones that are regular files (namely, not symlinks or directories)?
For example, using getDirectoryContents
main = do
    contents <- getDirectoryContents "/foo/bar"
    let onlyFiles = filterFunction contents in
      print onlyFiles
where "filterFunction" is a function that returns only the FilePaths that represent files.
The answer may just work on Linux, but cross platform support is preferred.
[EDIT] Just using doesDirectoryExist doesn't work as expected. This script prints a list of everything in the directory, not just files:
module Main where

import System.Directory
import Control.Monad (filterM, liftM)

getFiles :: FilePath -> IO [FilePath]
getFiles root = do
    contents <- getDirectoryContents root
    filesHere <- filterM (liftM not . doesDirectoryExist) contents
    subdirs <- filterM doesDirectoryExist contents
    return filesHere

main = do
    files <- getFiles "/"
    print $ files
Additionally, the variable subdirs will only contain "." and "..".
To find standard library functions, Hoogle is a great resource; it's a Haskell search engine that lets you search by type. Using it requires figuring out how to think about types the Haskell Way™, though, which your proposed type signature doesn't quite fit. So:
You're looking for [Filepath] -> [Filepath]. Remember, the Haskell spelling is FilePath. So…
You're looking for [FilePath] -> [FilePath]. This is unnecessary; if you want to filter things, you should use filter. So…
You're looking for a function of type FilePath -> Bool that you can pass to filter. But this can't quite be right: this function needs to query the file system, which is an effect, and Haskell tracks effects in the type system using IO. So…
You're looking for a function of type FilePath -> IO Bool.
And if we search for that on Hoogle, the first result is doesFileExist :: FilePath -> IO Bool from System.Directory. From the docs:
The operation doesFileExist returns True if the argument file exists and is not a directory, and False otherwise.
So System.Directory.doesFileExist is exactly what you want. (Well… only with a little extra work! See below.)
Now, how do you use it? You can't use filter here, because you have an effectful function. You could use Hoogle again – if filter has the type (a -> Bool) -> [a] -> [a], then annotating the results of the functions with a monad m gives you the new type Monad m => (a -> m Bool) -> [a] -> m [a] – but there's an easier "cheap trick". In general, if func is a function with an effectful/monadic version, that effectful/monadic version is called funcM, and it often lives in Control.Monad.¹ And indeed, there is a function Control.Monad.filterM :: Monad m => (a -> m Bool) -> [a] -> m [a].
However! Much as we hate to admit it, even in Haskell, types don't provide all the information you need. Importantly, we're going to have a problem here:
File paths given as arguments to functions are interpreted relative to the current directory, but…
getDirectoryContents returns paths relative to its argument.
Thus, there are two approaches we can take to fix things. The first is to adjust the results of getDirectoryContents so that they can be interpreted correctly. (We also discard the . and .. results, although if you're just looking for regular files they won't hurt anything.) This will return file names which include the directory whose contents are being examined. The adjusted getDirectoryContents function looks like this:
getQualifiedDirectoryContents :: FilePath -> IO [FilePath]
getQualifiedDirectoryContents fp =
    map (fp </>) . filter (`notElem` [".",".."]) <$> getDirectoryContents fp
The filter gets rid of the special directories, and the map prepends the argument directory to all the results. This makes the returned files acceptable arguments to doesFileExist. (If you haven't seen them before, (System.FilePath.</>) appends two file paths; and (Control.Applicative.<$>), also available as (Data.Functor.<$>), is an infix synonym for fmap, which is like liftM but more broadly applicable.)
Putting that all together, your final code becomes:
import Control.Applicative
import Control.Monad
import System.FilePath
import System.Directory

getQualifiedDirectoryContents :: FilePath -> IO [FilePath]
getQualifiedDirectoryContents fp =
    map (fp </>) . filter (`notElem` [".",".."]) <$> getDirectoryContents fp

main :: IO ()
main = do
    contents <- getQualifiedDirectoryContents "/foo/bar"
    onlyFiles <- filterM doesFileExist contents
    print onlyFiles
Or, if you feel like being fancy/point-free:
import Control.Applicative
import Control.Monad
import System.FilePath
import System.Directory

getQualifiedDirectoryContents :: FilePath -> IO [FilePath]
getQualifiedDirectoryContents fp =
    map (fp </>) . filter (`notElem` [".",".."]) <$> getDirectoryContents fp

main :: IO ()
main =   print
     =<< filterM doesFileExist
     =<< getQualifiedDirectoryContents "/foo/bar"
The second approach is to adjust things so that doesFileExist runs with the appropriate current directory. This will return just the file name relative to the directory whose contents are being examined. To do this, we want to use the withCurrentDirectory :: FilePath -> IO a -> IO a function (but see below), and then pass getDirectoryContents the current directory "." argument. The documentation for withCurrentDirectory says (in part):
Run an IO action with the given working directory and restore the original working directory afterwards, even if the given action fails due to an exception.
Putting all this together gives us the following code
import Control.Monad
import System.Directory

main :: IO ()
main = withCurrentDirectory "/foo/bar" $
    print =<< filterM doesFileExist =<< getDirectoryContents "."
This is what we want, but unfortunately, it's only available in version 1.3.2.0 of the directory package – as of this writing, the most recent one, and not the one I have. Luckily, it's an easy function to implement; such set-a-value-locally functions are usually implemented in terms of Control.Exception.bracket :: IO a -> (a -> IO b) -> (a -> IO c) -> IO c. The bracket function is run as bracket before after action, and it correctly handles exceptions. So we can define withCurrentDirectory ourselves:
withCurrentDirectory :: FilePath -> IO a -> IO a
withCurrentDirectory fp m =
    bracket getCurrentDirectory setCurrentDirectory $ \_ -> do
        setCurrentDirectory fp
        m
And then use this to get the final code:
import Control.Exception
import Control.Monad
import System.Directory

withCurrentDirectory :: FilePath -> IO a -> IO a
withCurrentDirectory fp m =
    bracket getCurrentDirectory setCurrentDirectory $ \_ -> do
        setCurrentDirectory fp
        m

main :: IO ()
main = withCurrentDirectory "/foo/bar" $
    print =<< filterM doesFileExist =<< getDirectoryContents "."
Also, one quick note about lets in dos: in a do block,
do ...foo...
   let x = ...bar...
   ...baz...
is equivalent to
do ...foo...
   let x = ...bar... in
     do ...baz...
So your example code doesn't need the in in the let and can outdent the print call.
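Applied to the example from the question, that means main can drop the in (with filterFunction still standing in for the real test):

main = do
    contents <- getDirectoryContents "/foo/bar"
    let onlyFiles = filterFunction contents
    print onlyFiles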
¹ Not always: sometimes you want different classes of effects! Use Applicative from Control.Applicative when possible; more things are Applicatives than are Monads (although this means you can do less with them). In that case, the effectful functions may live there, or also in Data.Foldable or Data.Traversable.
For Unix systems, the unix package exposes these APIs:
isRegularFile
isSymbolicLink
isDirectory
You can use a combination of them to achieve what you want. A sample demo of using them in GHCi:
λ> import System.Posix.Files
λ> status <- getFileStatus "/home/sibi"
λ> isDirectory status
True
λ> isRegularFile status
False
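For example, here is a small sketch that keeps only the regular files in a (hypothetical) directory; getSymbolicLinkStatus is used so that symlinks are not followed and are therefore excluded:

import Control.Monad (filterM)
import System.Directory (getDirectoryContents)
import System.FilePath ((</>))
import System.Posix.Files (getSymbolicLinkStatus, isRegularFile)

regularFilesIn :: FilePath -> IO [FilePath]
regularFilesIn dir = do
    names <- getDirectoryContents dir
    let paths = [dir </> n | n <- names, n `notElem` [".", ".."]]
    -- keep only entries whose status says "regular file"
    filterM (fmap isRegularFile . getSymbolicLinkStatus) paths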
You can use the shelly library. It's dedicated to shell scripting in Haskell. Here is the solution with shelly:
module Sh where

import Control.Monad
import Data.String
import Shelly

dir = fromString "/home/me"

printAll = mapM_ print

main = do
    files <- shelly $ filterM test_f =<< ls dir
    printAll files
We use the functions:
ls - for listing the directory content.
ls :: FilePath -> Sh [FilePath]
test_f - for testing whether a path is a file:
test_f :: FilePath -> Sh Bool
shelly - to execute the script:
shelly :: MonadIO m => Sh a -> m a
We also use fromString to create shelly's FilePath; it is a dedicated type, not just a plain string.
I happened to need a way to list only regular files in a directory, and this is how I do it. I thought it might be helpful:
import Control.Monad (filterM)
import System.Directory
import System.FilePath ((</>))

listFilesInDirectory :: FilePath -> IO [FilePath]
listFilesInDirectory dir = do
    rawList <- listDirectory dir
    filterM doesFileExist (map (dir </>) rawList)
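A quick usage sketch (with a made-up directory):

main :: IO ()
main = print =<< listFilesInDirectory "/foo/bar"

Note that listDirectory already omits . and .., which is why no extra filtering is needed here.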

I18n strategies for Go with App Engine

Not necessarily specific to GAE I suppose, but I'm curious as to what people are using to translate or localise their web applications.
My own approach I'm afraid is hopelessly naive, really just a hand-wave at the issue by loading an entity from the datastore for each package based on a locale value recorded in the user's profile. At least this allows translations of a few strings to be provided:
package foo

...

type Messages struct {
    Locale          string
    ErrorDatastore  string
    LoginSuccessful string
    ...
}
Store with a string id corresponding to a locale, then load to Gorilla context or similar:
const Messages ContextKey = iota
...
k := datastore.NewKey(c, "Messages", "en_US", 0, nil)
m := new(Messages)
if err := datastore.Get(c, k, m); err != nil {
    ...
} else {
    context.Set(r, Messages, m)
}
Which is obviously incredibly limited, but at least makes strings available from calling code via context.Get(r, foo.Messages). Can anyone point me at more useful implementations, or suggest a better approach?
Edit (relevant but not completely useful):
gettext: a MO file parser
go-i18n
Internationalization plan for Go
Polyglot
Jonathan Chan points out Samuel Stauffer's go-gettext which seems to do the trick. Given the directories:
~appname/
|~app/
| `-app.go
|+github.com/
`-app.yaml
Start with (assumes *nix):
$ cd appname
$ git clone git://github.com/samuel/go-gettext.git github.com/samuel/go-gettext
Source preparation cannot use the _("String to be translated") short form, due to underscore's special characteristics in Go. You can tell xgettext to look for the camelcase function name "GetText" using the -k flag.
Minimal working example:
package app
import (
"fmt"
"log"
"net/http"
"github.com/samuel/go-gettext"
)
func init () {
http.HandleFunc("/", home)
}
func home(w http.ResponseWriter, r *http.Request) {
d, err := gettext.NewDomain("appname", "locale")
if err != nil {
log.Fatal("Failed at NewDomain.")
}
cat := d.GetCatalog("fr_FR")
if cat == gettext.NullCatalog {
log.Fatal("Failed at GetCatalog.")
}
fmt.Fprintf(w, cat.GetText("Yes."))
}
Create the template with:
$ xgettext -d appname -kGetText -s -o appname.pot app/app.go
Note -k, without it there'll be no output as xgettext won't recognise calls to GetText. Edit relevant strings, email etc in appname.pot. Let's assume we're localising for French:
$ mkdir -p locale/fr_FR/LC_MESSAGES
$ msginit -l fr_FR -o french.po -i appname.pot
Edit french.po:
# Appname l10n
# Copyright (C) 2013 Wombat Inc
# This file is distributed under the same license as the appname package.
# Wombat <wombat@example.com>, 2013.
#
msgid ""
msgstr ""
"Project-Id-Version: appname v0.1\n"
"Report-Msgid-Bugs-To: \n"
"POT-Creation-Date: 2013-01-13 11:03+1300\n"
"PO-Revision-Date: 2013-01-13 11:10+1300\n"
"Last-Translator: Rich <rich#example.com>\n"
"Language-Team: French\n"
"Language: fr\n"
"MIME-Version: 1.0\n"
"Content-Type: text/plain; charset=UTF-8\n"
"Content-Transfer-Encoding: 8bit\n"
"Plural-Forms: nplurals=2; plural=(n > 1);\n"
#: app/app.go:15
msgid "Yes."
msgstr "Oui."
Generate the binary (the file that'll actually get deployed with the app):
$ msgfmt -c -v -o locale/fr_FR/LC_MESSAGES/appname.mo french.po
Final directory structure:
~appname/
|~app/
| `-app.go
|~github.com/
| `~samuel/
|   `~go-gettext/
|     +locale/
|     |-catalog.go
|     |-domain.go
|     `-mo.go
|~locale/
| `~fr_FR/
|   `~LC_MESSAGES/
|     `-appname.mo
`-app.yaml
(locale directory under go-gettext holds test data, could be removed for deployment.)
If all goes well, a visit to appname should display "Oui."
go-i18n is an alternative package with some nice features:
Implements CLDR plural rules.
Uses text/template for strings with variables.
Translation files are simple JSON.
GNU Gettext is widely adopted as a de facto standard for i18n solutions.
To use .po files directly from your Go project and load all translations in memory for better performance, you can use my package: https://github.com/leonelquinteros/gotext
It's fairly simple and straight to the point.
So, given a default.po file (formatted after GNU gettext: https://www.gnu.org/software/gettext/manual/html_node/PO-Files.html) located in /path/to/locales/es_ES/default.po you can load it using this package and start consuming the translations right away:
import "github.com/leonelquinteros/gotext"
func main() {
// Configure package
gotext.SetLibrary("/path/to/locales")
gotext.SetLanguage("es_ES")
// Translate text from default domain
println(gotext.Get("Translate this text"))
}
If you prefer to have the translations defined in a string for a more "focused" use, you can parse a PO formatted string with a Po object:
import "github.com/leonelquinteros/gotext"
func main() {
// Set PO content
str := `
msgid "One apple"
msgstr "Una manzana"
msgid "One orange"
msgstr "Una naranja"
msgid "My name is %s"
msgstr "Mi nombre es %s"
`
// Create Po object
po := new(Po)
po.Parse(str)
// Get a translated string
println(po.Get("One orange"))
// Get a translated string using variables inside the translation
name := "Tom"
println(po.Get("My name is %s", name))
}
As you can see on the last example, it's also possible to use variables inside the translation strings.
While most solutions are pretty similar, including yours, using a common format such as gettext can bring some extra benefits.
Also, your solution doesn't seem to be safe for concurrent use (when consumed from several goroutines). This package handles all of that for you. There are also unit tests for the package, and contributions are welcome.

What is causing the scaleX method of Imager class to fail?

This is a cross post from Perl Monks and Mahalo answers, where I have not received a satisfactory response yet. Thanks for your time and spirit:
Why do I get this error message from perl:
Can't call method "scaleY" on an undefined value at C:/strawberry/perl/site/lib/Image/Seek.pm line 137?
I am getting the error in the title when calling the Image::Seek module from my script. My script is basically a rehash of the module's suggested code.
Here's the error again:
Can't call method "scaleY" on an undefined value at C:/strawberry/perl/site/lib/Image/Seek.pm line 137.
Here's my code:
#!/usr/local/bin/perl
use Imager;
use Image::Seek qw(loaddb add_image query_id savedb);
loaddb("haar.db");
my $img = Imager->new("photo-1.jpg")
or die Imager->errstr;
# my $img = Imager->new();
# $img->open(file => "photo-1.jpg")or die Imager->errstr;
add_image($img, 1);
savedb("haar.db");
Here's the section of the Image::Seek module causing the issue:
sub add_image_imager {
    my ($img, $id) = @_;
    my ($reds, $blues, $greens);
    require Imager;

    my $thumb = $img->scaleX(pixels => 128)->scaleY(pixels => 128);
    for my $y (0..127) {
        my @cols = $thumb->getscanline(y => $y);
        for (@cols) {
            my ($r, $g, $b) = $_->rgba;
            $reds .= chr($r); $blues .= chr($b); $greens .= chr($g);
        }
    }
    addImage($id, $reds, $greens, $blues);
}
Line 137 is:
my $thumb = $img->scaleX(pixels => 128)->scaleY(pixels => 128);
If I remove
->scaleY(pixels => 128)
then line 129:
my @cols = $thumb->getscanline(y => $y);
gives me essentially the same error.
At this point I'm just trying to add one image to the database. There is an image in the directory where I'm running the script to add the image, named "photo-216.jpg". If I change the name to "photo-1.jpg" or "photo-0.jpg" and change the corresponding "add_image" and "query_id" to respectively 1 or 0, it's the same result.
I do have a database that is 385 KB in size, which comes from running makedb.pl below, but it is filled with null characters. I renamed this "haar.db". This is the database that gives me the error. If I recreate the haar.db file as an empty one, then the script hangs and, after a couple of minutes, gives this different message:
"This application has requested the Runtime to terminate it in an unusual way. Please contact the application's support team for more information."
If there is no "haar.db", the script still gives me the error in this post's title and, unlike running makedb.pl, produces no database named "haar.db".
By the way I get multiple examples of this post's title error also when trying to run this database filling script: http://www.drk7.jp/pub/imgseek/t/makedb.pl.txt/, which I was alluding to before. I obviously removed the .txt extension before trying it. The makedb.pl script is from this Japanese site: http://www.drk7.jp/MT/archives/001258.html.
If I run makedb.pl in a directory of 2423 scanned collectible postage stamp images, I get 362 instances of the error. The 2423 stamps is the number I have after removing the "small" thumbnail versions, which I originally thought might be causing the issue.
Could it be that some of the images are less than 128 pixels and that is the issue? However, if this is true, why does the database get filled with null characters? ... Unless they are not really null, even though the editor I'm using, Notepad++, says they are.
Also note my images are of stamps, which are only sometimes perfect squares. Otherwise they are sometimes "landscape" and sometimes "portrait". Maybe the issue is when the "landscape" scaled images get an X axis of 128 pixels and their Y axis then ends up less or much less. Could this be?
Thanks much
Update: Answer completely re-organized.
Image::Seek is not checking whether scaleX returned an error. In your case, for some images, scaleX is failing.
You seem to know for which images scaleX is failing. So, leave your current code aside and put together a short test script:
#!/usr/bin/perl

use strict;
use warnings;
use Imager;

die "Specify image file name\n" unless @ARGV;
my ($imgfile) = @ARGV;

my $img = Imager->new;
$img->read( file => $imgfile )
    or die "Cannot read '$imgfile': ", $img->errstr;

my $x_scaled = $img->scaleX( pixels => 128 )
    or die 'scaleX failed: ', $img->errstr;

my $thumb = $x_scaled->scaleY( pixels => 128 )
    or die 'scaleY failed: ', $x_scaled->errstr;

__END__
Running this test script, you got the error message:
Cannot read 'photo-1.jpg': format 'jpeg' not supported - formats bmp,
ico, pnm, raw, sgi, tga available for reading
indicating the underlying problem: when you installed Imager via Strawberry Perl's cpan, the libraries for PNG, JPEG etc. were not installed. One solution is to build those libraries with the gcc compiler provided with Strawberry Perl.
First, you will need zlib.
C:\Temp\zlib-1.2.3> copy win32\Makefile.gcc Makefile
Set prefix = /strawberry/c/local in the Makefile. Compile. You may have to manually copy the files zlib.h and zconf.h to C:\strawberry\c\local\include, and zlib1.dll, libz.a and libzdll.a to C:\strawberry\c\local\lib (I don't know for sure, because I do not use Strawberry Perl very often and my Strawberry environment is very neglected).
Then, get libpng. I used the source archive without config script.
C:\Temp\libpng-1.2.38> copy scripts\makefile.mingw Makefile
C:\Temp\libpng-1.2.38> make prefix=/strawberry/c/local ZLIBLIB=/strawberry/c/local/lib ZLIBINC=/strawberry/c/local/include
This built the PNG library. Again, you may have to manually copy the .dll, .a and .h files to the appropriate directories. I did, because of my less than perfect Strawberry environment.
Finally, get the JPEG library.
C:\Temp\jpeg-7> copy Makefile.ansi Makefile
Make sure to edit this file and set CC=gcc. Customize jconfig.h according to the instructions in jconfig.txt. I used jconfig.dj as a basis.
You might also want to set
CFLAGS= -O2
SYSDEPMEM= jmemansi.o
in Makefile, and
#define DEFAULT_MAX_MEM 4*1024*1024
in jconfig.h. After running make, again copy the files as needed (and as explained by install.txt).
Once the libraries are installed, you can
C:\Temp> SET IM_INCPATH=C:\strawberry\c\local\include
C:\Temp> SET IM_LIBPATH=C:\strawberry\c\local\lib
C:\Temp> cpan
cpan> force install Imager
which yields:
gif: includes not found - libraries not found
ungif: includes not found - libraries not found
jpeg: includes found - libraries found
png: includes found - libraries found
tiff: includes not found - libraries not found
freetype2: includes not found - libraries not found
freetype2: not available
T1-fonts: includes not found - libraries not found
TT-fonts: includes not found - libraries not found
w32: includes found - libraries found
If all of this is too much work, it is ... sigh ... I just realized the binaries are available at GnuWin32.
