
dokuwiki-plugin-siteexport's Introduction

Dokuwiki Site Export

Site Export is an admin plugin that offers a convenient way to download your DokuWiki site as a standalone version. It cycles through your namespaces (a starting point can be given at run-time) and packs the result into a ZIP file. The result can be unpacked anywhere and viewed without an internet connection. That makes the plugin perfect for providing static documentation to customers on a CD or DVD.

Requirements

  • DokuWiki version Weatherwax, Binky, or newer
  • You need to be logged in as an administrator to access the siteexport plugin
  • The zip compression library of your PHP installation must be enabled
  • dw2pdf for PDF export options
  • a writable /inc/preload.php file for template switching

Configuration

This is about the Admin --> Configuration Manager page.

  • Default Export Depth:
    How deep the export should go. This option is used when "specific depth" is selected as the Export Type.
  • Try to export non-public pages:
    By default, SiteExport only exports public pages. This option also allows exporting non-public pages that the currently logged-in user has access to. (not yet implemented)
  • Wiki Path and name for exported ZIP file:
    DokuWiki namespace and file name that will be used to create temporary files.
  • Pattern to exclude resources:
    A regular expression defining paths that should not be exported.
  • Maximum script execution time:
    Defines how long, in seconds, the script may run while exporting a site via a URL request or a wget/curl request. Due to PHP settings this may be very limited, and exporting a very large site or namespace would time out. This option takes care of redirecting the request as many times as needed until the export is finished for all pages (the time should be long enough to export at least one page).
  • Debug Level:
    Level of debug output during the export job. This can be important for finding errors when asking for support.
  • Debug File:
    Where the debug log will be written. It has to be a writable destination.
  • Cache time for export:
    SiteExport uses its own cache timer to determine when an export should be discarded.
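The URL-request export mentioned above can be scripted. The sketch below is an assumption-laden illustration: the wiki URL, namespace, credentials, and query parameters are placeholders (copy the real static URL from the plugin's Start Process section instead of building one by hand). The key point is the redirect behavior described under "Maximum script execution time": the client must follow redirects until the export is finished.

```shell
# Sketch only: WIKI, NS, and the credentials below are placeholders, and
# the query parameters may differ per installation; prefer copying the
# static URL shown in the plugin's "Start Process" section.
WIKI="https://wiki.example.com"
NS="docs:manual"

EXPORT_URL="${WIKI}/doku.php?do=siteexport&ns=${NS}&depth=0"
echo "$EXPORT_URL"

# The plugin keeps redirecting the request until all pages are exported,
# so follow redirects (-L) and allow a generous timeout:
# curl -L --max-time 3600 -u admin:secret -o siteexport.zip "$EXPORT_URL"
```

The actual download call is left commented out since it requires a reachable wiki; the echoed URL shows the shape of the request.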

How to export pages

SiteExport is only available from the Admin menu, in the Additional Plugins section. When starting from the page you want to export, simply go to the export menu and hit start.

Enter your starting Namespace

Basic export options

Set Namespace

The namespace/page you actually want to export. This is prefilled with the page you were visiting.

Parent Namespace to export

By default this is the same namespace/page that you are going to export. That results in a flat structure, with the pages at the top level.

You can define a higher namespace, which results in the structure below it being exported, potentially with empty folders, but with the lib (plugins, template) directories at the top level.

This is useful for exporting translated namespaces starting with the root of the translation.

Export Type

How many pages should be exported?

  • This page only:
    Attempts to export only this one page.
  • All sub namespaces:
    Exports everything below the defined namespace.
  • Specific depth:
    Exports everything below the defined namespace, but only down to the defined depth, i.e. the number of namespace levels to descend.
Depth

Number of namespace levels to go down into.

Export Linked Pages

Also exports linked pages that are outside of, or deeper than, the defined namespace.

Select your Options

Export Absolute Paths

Export Body only

Adds an option for renderers to export only the inner body instead of the whole page.

Export all parameters (e.g. "do")

Adds all parameters to the links in exported pages, which may make sense when using JavaScript that relies on the links.

Render Engine

By default, DokuWiki's own render engine is used. This option allows exporting the pages with other renderers, e.g. siteexport_pdf (derived from dw2pdf) to produce pages in PDF format.

Export Template

Only available if inc/preload.php is writable.
Allows exporting the pages with a template other than the default one.

PDF Export

Only available if the dw2pdf plugin is installed.
Exports the pages into PDF files, one per page. There are options (TOC) to export multiple pages into one large PDF.

Numbered Headings

Only available if the dw2pdf plugin is installed.
Adds a number to each heading. Useful for a Table Of Contents inside the PDF.

Select one of the Help Creation Options (optional)

This is entirely optional.

Create Eclipse Help:

Allows the creation of context.xml and map.xml files that can be used by Eclipse and its Plugins.

Create Java Help:

Allows the creation of tox.xml and map.xml files that can be used by Java and the Java Help implementation.

Use TOC file in Namespace

If you do not want the export to be structured like your DokuWiki is, you can create a file called toc in the namespace and create a custom structure that will be used instead.

This is great for having all the chapters of a documentation in their own file and exporting them into PDF as a single file.

See Table Of Contents definition.

Disable (JS/CSS) Plugins while export

Each checkbox stands for a plugin. Checking it temporarily disables the plugin, so it will not generate any CSS or JS output.

This is great for a static export that needs no plugins, or only some of them. Be advised that disabling plugins might improve the speed of the PDF export.

Custom Options

Here you can add additional variables that will be passed to the exported pages. This can help to create content dynamically when using other plugins or PHP execution.

Simply hit "add Option" for a new name/value field, then enter the variable's name and value. Done.

Start Process

The three links are convenience links. They are regenerated on every change of any option. They reflect static URLs that can be copied and used, e.g. for Ant jobs.

Now: Hit start and your pages will be exported.

Status

Reflects what is currently going on and will display errors that occur during exporting or changing options.

Save as Cron Job

If your configuration directory is writable, which it should be after setup, you can save your current setup here.

You can list the saved jobs, view them, delete them, and re-run them.

If you have CLI access (a terminal) and cron access to your server, you can use the cron.php file to schedule runs of your saved cron jobs.
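For scheduled runs, the cron.php mentioned above can be invoked from the system crontab. This is only a sketch: the installation path is an assumption, and any arguments cron.php expects depend on the jobs saved in the admin interface.

```shell
# Crontab fragment (sketch): run the saved siteexport cron jobs nightly
# at 03:00. The path /var/www/dokuwiki/lib/plugins/siteexport/cron.php is
# an assumption; adjust it to your installation.
#
#   0 3 * * * php /var/www/dokuwiki/lib/plugins/siteexport/cron.php
#
# Verify with your PHP CLI that cron.php runs without extra options first.
```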

Table Of Contents definition

If you do not want the export to be structured like your DokuWiki is, you can create a file called toc in the namespace and create a custom structure that will be used instead.

This is great for having all the chapters of a documentation in their own file and exporting them into PDF as a single file.

The structure is basically a list of links:

<toc>
  * [[.:index|Index of the page]]
    * [[.:sub:index|Index of the sub namespace]]
      * [[.:sub:sub:index|Index of the sub/sub namespace]]
    * [[.:sub:page|Page in the sub namespace]]
  * [[.:another-page|Another page]]
    * [[.:another-sub:index|Index of another sub namespace]]
</toc>

The <toc> tag supports several options:

  • notoc: hide the user-defined TOC in the document
  • description: display the description abstract of the linked page below the link (useful together with ~~META:description abstract=This is my abstract.~~)
  • merge: merge all the documents defined in the TOC into the current document
  • mergeheader: in addition to merge, merge all headers starting with the first document (new headers of later documents are appended at the end; they are not sorted alphabetically)
  • mergehint: add hints about the merged content. It can be configured using the meta plugin and the key "mergehint = ". It falls back to the "thema" key or the page title.
  • pagebreak: insert a pagebreak after each page defined by the namespace

You have to define the options like this: <toc notoc merge>

Siteexport

Add a download button for the current page - or any other page

{{siteexport [options]}}

Siteexport Aggregator

There is the additional syntax: aggregator. This allows an in-page selection from an ordered list of pages in the current namespace and sub-namespaces. Once selected and submitted, that page will be generated with the provided options and merged together with the pages above it in the list (merging actually proceeds top-down).

The Syntax is (and can be used multiple times per document):

{{siteexportAGGREGATOR [options]}}
  • This will actually create a <toc> internally, using the options merge and mergeheader
  • Without options it will generate a dropdown list of all pages in the namespace (except the current one)
  • The list will be ordered by a meta key mergecompare, which has to be added via the META plugin.
  • You can create an element with predefined options using the editor button.
  • There are additional options:
    • exportSelectedVersionOnly: if set, only the selected entry will be exported. It will then export this one page with the metadata of the page that has the aggregator.
    • includeSelectedVersion: exports all documents from the newest down to the document directly prior to the selected one.
    • mergehint=false: disables the addition of merge hints

dokuwiki-plugin-siteexport's People

Contributors

alainbecker, alexgearbox, araname, bantig1, brice187, dregad, fakdid, flo22100, gamma, klap-in, michitux, mijndert, nex-otaku, nfriedli, rneej, sawachan, schplurtz, schwarzera, scrutinizer-auto-fixer, splitbrain, trebmuh


dokuwiki-plugin-siteexport's Issues

Plugin Automation?

I'm trying to set up the plugin for automation. I read the README document and it references the three links generated at the bottom of the siteexport page that change based on the options you select. I also noticed the section that mentions the CRON functionality. Could you provide a little clarification on how to use all the mentioned items to automate the plugin?

Permission Denied on all pages

I just installed and was testing this plugin. I was exporting to PDF and had dw2pdf installed. Everything looked like it ran fine and the zip file contained my structure. But every page just says "Permissions Denied". I assumed it was because I was using Google to login. So I logged in directly with my local admin account, re-exported (with cache cleared so it would start from scratch) and got the same results. I'm attaching a screenshot of what every page looks like.

(screenshot attached)

Linked pages are not exported

I'm trying to export a full namespace. The pages in it, including its headpage, contain links to pages located outside the namespace, but they are not exported.

I've tried both namespace export options: all subnamespaces and specified depth, with the option Export Linked Pages enabled.

Blank page issue/feature (due to Indexmenu sorting namespaces separately from pages)

We would like a way to filter out all "blank.html" files and their parent namespaces from the export, so that this does not have to be done by hand. It should possibly be an option with a checkbox. We need to keep the blank pages in the wiki.

Remove all "blank.html" files and, provided there is just that file in the directory, remove the parent namespace as well ("Checkbox").

================
Problem: Blank.html pages that are under an otherwise empty namespace are being exported. We would like to avoid that if possible.

When I do a search in the wiki search bar for "blank" Result: 45 lines x 4 cols = 180 blank pages.

https://opencpn.org/wiki/dokuwiki/doku.php?do=search&id=blank

I picked one of them and it happened to be

https://opencpn.org/wiki/dokuwiki/doku.php?id=opencpn:opencpn_user_manual:edit_user_manual:indexmenu_plugion:blank

I opened it and deleted the text "This page intentionally left blank". When building the wiki this was done for each page that did not sort properly: a "blank" sub-page was created so that the indexmenu plugin would be able to sort properly. (This action forces the "page" to become a "namespace", so all pages are actually "namespaces" and thus sorted in the same way by indexmenu.)

https://opencpn.org/wiki/dokuwiki/doku.php?id=opencpn:opencpn_user_manual:edit_user_manual:indexmenu_plugion:this_page_intentionally_left_blank

Note: If we delete the text in the blank sub-page "This page intentionally left blank" and hit "enter", the sub-page will be removed automatically ("this page does not exist") along with the namespace, and the parent page will no longer sort properly. Look at the left navigation index for Indexmenu Plugin down under Editor Manual, under Test Questions. The code at the top of the page says "Indexmenu sort number: 3" or {{indexmenu_n>3}}, so this page is intended to be #3 underneath Editor Manual!
When the blank page is removed, the parent page drops out of the sort order due to the way indexmenu sorts namespaces first and then pages!

So how do we get rid of these 200 instances of blank pages and associated namespaces?

They all should be named with "blank" at the end!!

https://opencpn.org/wiki/dokuwiki/doku.php?id=opencpn:user_corner:blank

-- So we need to do a Regex search on "blank" pages and delete them in a cache and not in the dokuwiki documents!

  • We also need to delete the parent directory of that "blank.html", no matter what the name is (they all have different names associated with the name of the parent page).

Could this be a filter applied just before writing to siteexport.zip on the server?

Could it have a checkbox under the SiteExport Plugin?

  • Perhaps this already exists in some form....?

We do NOT want to delete the blank pages on the wiki as that will mess up the sort order.

plugin resets error_reporting values

Installing this plugin causes the error_reporting values set in inc/init.php to be reset, creating a whole bunch of strict warnings in a common installation.

This is caused by the error_reporting line in cron.php which is included in the action/cron.php file.

SyntaxError: missing ; before statement

Hi,

I get the following error in both Internet Explorer and Firefox:

(screenshot 1)

I have tried to fix it by adding a semicolon in script.js:

(screenshot 2)

But I think the error is not really fixed; the status panel now shows me many other errors:

(screenshot 3)

I've tested it with Weatherwax and a completely clean Binky RC2 version.

Namespace Confusion for export

  1. Set Namespace: opencpn:toc <-- OpenCPN User Manual TOC https://opencpn.org/wiki/dokuwiki/doku.php?id=opencpn:toc flat totally opened TOC created with this code in the page:
    {{indexmenu>:opencpn:opencpn_user_manual#5|msort nsort nojs}}
  2. Parent Namespace to export: opencpn:opencpn_user_manual
  3. Export Type: All sub namespaces
  4. Export Linked Pages: checked
  5. Export all parameters (e.g. "do"): checked
  6. Render Engine: ckgedit
  7. Export Template: bootstrap3
  8. Default Lang: EN
  9. Disable JS Plugins: Check
    START

It immediately starts "Adding"
opencpn:developer_manual .....
opencpn:supplementary_software...
opencpn:supplementary_hardware...

Yet those are not in my "Parent namespace to export" !!

images are not exported correctly

siteexport dated 2014-05-21 does not export images correctly in the xhtml export. All images in the export are identical; only one image is exported and used on all pages.

If I go back to siteexport dated 2013-11-20 then images are exported correctly.

zip file can't be created.

Hello,

I installed the siteexport plugin and tried to export pages to HTML, but it seems the zip file couldn't be created.
The Zip extension is enabled.
My environment and the content of the debug file are shown below; please tell me what the problem is and how to fix it.

- OS: CentOS release 5.11 (Final)
- Web Server: Apache/2.2.3
- Web Access Security: use ".htaccess" files
- DokuWiki version: 2015-08-10 "Detritus"
- PHP version: 5.3.3
- Zip archive: Zip => enabled, Extension Version => $Id: php_zip.c 300470 2010-06-15 18:48:33Z pajoye $, Zip version => 1.9.1, Libzip version => 0.9.0
- extension_dir => /usr/lib64/php/modules/

[2015-09-16 09:46:48 INFO] Calculated the following Cache Hash URL: [2015-09-16 09:46:48 INFO] doku.php/help/search/searchmethod?addParams=1&defaultLang=&depth=0&depthType=0.0&do=siteexport&ens=help%3Asearch%3Asearchmethod&renderer=&template=dokuwiki [2015-09-16 09:46:48 INFO] Export Filename for 'wiki:siteexport.zip' will be: 'wiki:siteexport.auto.1fe1e2284bd4e7610e4ad8ee71d6de2f.zip' [2015-09-16 09:46:48 INFO] Export Filename for 'siteexport.zip' will be: 'siteexport.auto.1fe1e2284bd4e7610e4ad8ee71d6de2f.zip' [2015-09-16 09:46:48 INFO] Checking for Cache, depthType:0 [2015-09-16 09:46:48 INFO] Calculated the following Cache Hash URL: [2015-09-16 09:46:48 INFO] doku.php/help/search/searchmethod?addParams=1&defaultLang=&depth=0&depthType=0.0&do=siteexport&ens=help%3Asearch%3Asearchmethod&renderer=&template=dokuwiki [2015-09-16 09:46:48 INFO] HASH-Pattern for CacheFile: [2015-09-16 09:46:48 INFO] 1fe1e2284bd4e7610e4ad8ee71d6de2f [2015-09-16 09:46:48 INFO] Export Filename for 'wiki:siteexport.zip' will be: 'wiki:siteexport.auto.1fe1e2284bd4e7610e4ad8ee71d6de2f.zip' [2015-09-16 09:46:48 INFO] New CacheFile because the file was over the cachetime: [2015-09-16 09:46:48 INFO] /opt/dokuwiki/data/cache/1/14b0d17d0b00153db222b16113deae8b.siteexport.zip [2015-09-16 09:46:48 INFO] Export Filename for 'wiki:siteexport.zip' will be: 'wiki:siteexport.auto.1fe1e2284bd4e7610e4ad8ee71d6de2f.zip' [2015-09-16 09:46:48 INFO] ======================================== [2015-09-16 09:46:48 INFO] Adding Site: 'help:search:searchmethod' [2015-09-16 09:46:48 INFO] 
---------------------------------------- [2015-09-16 09:46:48 INFO] Array [2015-09-16 09:46:48 INFO]([2015-09-16 09:46:48 INFO] [sectok] => f5d9e9721c9f5f3e0b68f1dd1cb3b36e [2015-09-16 09:46:48 INFO] [ns] => help:search:searchmethod [2015-09-16 09:46:48 INFO] [ens] => help:search:searchmethod [2015-09-16 09:46:48 INFO] [depthType] => 0.0 [2015-09-16 09:46:48 INFO] [depth] => 0 [2015-09-16 09:46:48 INFO] [addParams] => 1 [2015-09-16 09:46:48 INFO] [renderer] => [2015-09-16 09:46:48 INFO] [template] => dokuwiki [2015-09-16 09:46:48 INFO] [defaultLang] => [2015-09-16 09:46:48 INFO] [call] => __siteexport_addsite [2015-09-16 09:46:48 INFO] [site] => help:search:searchmethod [2015-09-16 09:46:48 INFO] [pattern] => 1fe1e2284bd4e7610e4ad8ee71d6de2f [2015-09-16 09:46:48 INFO] [base] => /dokuwiki/ [2015-09-16 09:46:48 INFO] [http_credentials] => [2015-09-16 09:46:48 INFO] [u] => [2015-09-16 09:46:48 INFO] [p] => [2015-09-16 09:46:48 INFO]) [2015-09-16 09:46:48 INFO] REQUEST for add_site: [2015-09-16 09:46:48 INFO] Array [2015-09-16 09:46:48 INFO]([2015-09-16 09:46:48 INFO] [defaultLang] => [2015-09-16 09:46:48 INFO] [template] => dokuwiki [2015-09-16 09:46:48 INFO]) [2015-09-16 09:46:48 INFO] 'http://{IP address}/dokuwiki/' [2015-09-16 09:46:48 INFO] Array [2015-09-16 09:46:48 INFO]([2015-09-16 09:46:48 INFO] [0] => /dokuwiki/ [2015-09-16 09:46:48 INFO] [1] => http://{IP address}/dokuwiki/ [2015-09-16 09:46:48 INFO] [2] => /dokuwiki/ [2015-09-16 09:46:48 INFO] [3] => http://{IP address}/dokuwiki/ [2015-09-16 09:46:48 INFO]) [2015-09-16 09:46:48 INFO] internal WL function result: 'http://{IP address}/dokuwiki/doku.php/help/search/searchmethod?defaultLang=&template=dokuwiki' [2015-09-16 09:46:48 INFO] 'http://{IP address}/' [2015-09-16 09:46:48 INFO] Array [2015-09-16 09:46:48 INFO]([2015-09-16 09:46:48 INFO] [0] => /dokuwiki/ [2015-09-16 09:46:48 INFO] [1] => http://{IP address}/dokuwiki/ [2015-09-16 09:46:48 INFO] [2] => /dokuwiki/ [2015-09-16 09:46:48 INFO] [3] => 
http://{IP address}/ [2015-09-16 09:46:48 INFO]) [2015-09-16 09:46:48 INFO] internal WL function result: 'http://{IP address}/help/search/searchmethod' [2015-09-16 09:46:48 INFO] Filename could be: [2015-09-16 09:46:48 INFO] searchmethod.html [2015-09-16 09:46:48 INFO] Fetching URL: 'http://{IP address}/dokuwiki/doku.php/help/search/searchmethod?defaultLang=&template=dokuwiki' [2015-09-16 09:46:48 WARN] HTTP status was '407' - but I was told to ignore it by the settings. [2015-09-16 09:46:48 WARN] http://{IP address}/dokuwiki/doku.php/help/search/searchmethod?defaultLang=&template=dokuwiki [2015-09-16 09:46:48 INFO] Adding file 'searchmethod.html' to ZIP [2015-09-16 09:46:48 INFO] Calculated the following Cache Hash URL: [2015-09-16 09:46:48 INFO] doku.php/?cache=nocache&debug=2&do=siteexport [2015-09-16 09:46:48 INFO] Export Filename for 'wiki:siteexport.zip' will be: 'wiki:siteexport.auto.9ca44c4ecdd3349a441230e669d3bcc5.zip' [2015-09-16 09:46:48 INFO] Starting to send a file from siteexporter [2015-09-16 09:46:48 INFO] Export Filename for 'wiki:siteexport.zip' will be: 'wiki:siteexport.auto.1fe1e2284bd4e7610e4ad8ee71d6de2f.zip' [2015-09-16 09:46:48 INFO] fetching cached file from pattern '1fe1e2284bd4e7610e4ad8ee71d6de2f' with name '/opt/dokuwiki/data/cache/1/14b0d17d0b00153db222b16113deae8b.siteexport.zip' [2015-09-16 09:46:48 WARN] Event Data Before: [2015-09-16 09:46:48 WARN] Array [2015-09-16 09:46:48 WARN]([2015-09-16 09:46:48 WARN] [media] => wiki:siteexport.zip [2015-09-16 09:46:48 WARN] [file] => /opt/dokuwiki/data/cache/1/14b0d17d0b00153db222b16113deae8b.siteexport.zip [2015-09-16 09:46:48 WARN] [orig] => /opt/dokuwiki/data/media/wiki/siteexport.zip [2015-09-16 09:46:48 WARN] [mime] => application/zip [2015-09-16 09:46:48 WARN] [download] => 1 [2015-09-16 09:46:48 WARN] [cache] => 0 [2015-09-16 09:46:48 WARN] [ext] => zip [2015-09-16 09:46:48 WARN] [width] => 0 [2015-09-16 09:46:48 WARN] [height] => 0 [2015-09-16 09:46:48 WARN] [status] => 200 
[2015-09-16 09:46:48 WARN] [statusmessage] => Not Found [2015-09-16 09:46:48 WARN] [ispublic] => 1 [2015-09-16 09:46:48 WARN]) [2015-09-16 09:46:48 WARN] '/opt/dokuwiki/data/cache/1/14b0d17d0b00153db222b16113deae8b.siteexport.zip' does not exist. Checking original ZipFile [2015-09-16 09:46:48 INFO] Export Filename for 'wiki:siteexport.zip' will be: 'wiki:siteexport.auto.1fe1e2284bd4e7610e4ad8ee71d6de2f.zip' [2015-09-16 09:46:48 WARN] The export must have gone wrong. The cached file does not exist. [2015-09-16 09:46:48 WARN] Array [2015-09-16 09:46:48 WARN]([2015-09-16 09:46:48 WARN] [pattern] => 1fe1e2284bd4e7610e4ad8ee71d6de2f [2015-09-16 09:46:48 WARN] [original File] => wiki:siteexport.zip [2015-09-16 09:46:48 WARN] [expected cached file] => /opt/dokuwiki/data/media/wiki/siteexport.auto.1fe1e2284bd4e7610e4ad8ee71d6de2f.zip [2015-09-16 09:46:48 WARN]) [2015-09-16 09:46:48 INFO] had to move another original file over. Did it work? No, it did not. [2015-09-16 09:46:48 INFO] Can't open the zip-file. [2015-09-16 09:46:48 INFO] /opt/dokuwiki/data/cache/1/14b0d17d0b00153db222b16113deae8b.siteexport.zip [2015-09-16 09:46:48 WARN] Event Data After: [2015-09-16 09:46:48 WARN] Array [2015-09-16 09:46:48 WARN] ( [2015-09-16 09:46:48 WARN] [media] => wiki:siteexport.zip [2015-09-16 09:46:48 WARN] [file] => /opt/dokuwiki/data/cache/1/14b0d17d0b00153db222b16113deae8b.siteexport.zip [2015-09-16 09:46:48 WARN] [orig] => /opt/dokuwiki/data/media/wiki/siteexport.zip [2015-09-16 09:46:48 WARN] [mime] => application/zip [2015-09-16 09:46:48 WARN] [download] => 1 [2015-09-16 09:46:48 WARN] [cache] => 0 [2015-09-16 09:46:48 WARN] [ext] => zip [2015-09-16 09:46:48 WARN] [width] => 0 [2015-09-16 09:46:48 WARN] [height] => 0 [2015-09-16 09:46:48 WARN] [status] => 200 [2015-09-16 09:46:48 WARN] [statusmessage] => Not Found [2015-09-16 09:46:48 WARN] [ispublic] => 1

Best regards,
J.Yagasaki

500 Server Error stopped at ( 64 / 247 )

SITEEXPORT PLUGIN

Uses IndexMenu Plugin for Left Navigation. Turned off all Javascript for the download. See settings below

The following line stopped the creation of 'siteexport.zip'; the progress indicator underneath stopped and a 500 error appeared in Firebug.

Adding "opencpn:opencpn_user_manual:charts:chart_sources" ( 64 / 247 )

Using Firefox 51.0.1 (32-bit) with Firebug 2.0.91 turned on in the menu (yellow).
I also have Debug checked under Admin > Config (use Firefox "File/Find" to search for "Debug" and check it; this should only be temporary, turn it off afterwards).

See attached please.
500 Internal Error-w.docx

screenshot 57
screenshot 59

Gerry do you have any suggestions? Thanks.

Exporting pages to Single Page Html with Indexmenu created TOC page

Set Opencpn Manual on the page
https://opencpn.org/wiki/dokuwiki/doku.php?id=opencpn:opencpn_user_manual:toc
Then goto "Admin" > Additional Plugins > SiteExportManager

Set Namespace: opencpn:opencpn_user_manual:toc
Parent Namespace to export:opencpn:opencpn_user_manual:toc
Export Type: specified depth
Depth:3 need down to 7
Export Linked Pages: Checked
Render Engine: ckgedit
Export Template: bootstrap3
Numbered Headings: checked
Use TOC file in Namespace: unchecked
Empty Namespaces in TOC: checked
Curl L used for download

SiteExport works to a depth of 3. When I set it to a depth of 4, we get timeout messages.
Sometimes they are on page 20, 30, or 72 of a total of 143 pages. It has never completed.
Should we look for errors in the pages, or lengthen the timeout period? (I believe we lengthened it last time too.)
Found this in php.ini
_SERVER["REQUEST_TIME_FLOAT"] | 1515120311.1936
_SERVER["REQUEST_TIME"] | 1515120311

I don't know if the pages have some problem, but we can create a large siteexport.zip of 72 MB (160 MB expanded) of the 360 pages using dw2pdf that downloads fine. We prefer the single-page HTML format that uses a browser, and it is easier to edit.

Attached is the result, siteexport.zip, which apparently only goes one level down. The lower-level pages are missing.

Do you have any suggestions for us?
siteexport(8).zip

The User Manual pages we need for the next software v4.8.2 release are here:
https://opencpn.org/wiki/dokuwiki/doku.php?id=opencpn:opencpn_user_manual

The IndexMenu created TOC , titled "Opencpn Embedded User Manual" here:
https://opencpn.org/wiki/dokuwiki/doku.php?id=opencpn:opencpn_user_manual:toc

Trouble with links in offline dokuwiki pages

Hello again!
This time I hope that somebody will answer me.
My problem is:
When I export pages and then try to open links in those offline pages, the link points to the page that is already open.
E.g. I open the HTML page "start", which has a link to the page "programms"; when I place the cursor on the link "programms", I see the path to the current page "start" instead of the path to "programms".
Result: I can't open links in my offline pages.

siteexport_pdf render engine, images not included

I'm using the plugin to export the whole wiki to PDF, but in place of the images I get a little square with a red cross. Since the plugin asks for Dw2Pdf to be enabled when trying to export to PDF with it disabled, I assume that even when selecting the render engine siteexport_pdf, it still uses Dw2Pdf.

I've tried exporting with the dw2pdf render engine, and the images are present.

Does anyone have an idea why the images are not included, and how to fix it?

Specified Depth setting - What is intended?

Set Namespace: :opencpn:opencpn_user_manual
Parent Namespace to export: :opencpn:opencpn_user_manual
Export Type: Specified Depth
Depth 5

It exports namespaces from
:opencpn:supplementary_hardware
:opencpn:supplementary_software
:opencpn:development_manual
Which is higher up namespace than :opencpn:opencpn_user_manual

Is this correct? I assumed that "Specified Depth" would be counted down from the "Set Namespace" to export.
Have I done something wrong here?

Should I have used "Set namespace :opencpn" ?

Exported Images Missing

I am attempting to export a large portion of our wiki into PDF form. I'm using PDF Export option so siteexport_pdf gets selected as the render engine. When I do that, the images show broken link icons. The image is linked in the following format:

{{ :ns1:ns2:ns3:image.png?800 |}}

I am able to access the images directly by their URL in my browser.

I have also tried replacing the images with a JPG to get rid of any possible issues with the alpha channel and I get the same results.

If I don't use the PDF Export option and just select dw2pdf, then the images work fine. But I prefer the layout of the siteexport_pdf. Any suggestions?

Upgrade fails, Dokuwiki no longer available. Then SiteExport will not install.

Tried upgrading SiteExport from Admin after updating several other plugins.
Screen went blank and spinner appeared. Waited 10 minutes.
Tried to get back on the website, not available, just a blank screen.
FTP to server, delete bin/plugins/siteexport.
Dokuwiki website then worked.
Tried to install siteexport fresh. It did the same thing and I had to FTP delete.

What do you think the problem is? SiteExport installed before.
See notes at bottom https://www.dokuwiki.org/plugin:siteexport

Plugins used: indexmenu, backup tool, captcha, chkedit, configuration manager, dwedit, extension manager, gallery, info, move, popularity, revert manager, searchindex manager, styling, translation, doku upgrade, user manager, wrap, plus, acl manager, plain auth.

Website: https://opencpn.org/wiki/dokuwiki/doku.php?id=opencpn

I would really like to get this working!
I can give you admin access as we have done if needed.

Large Export download works locally, but not online.

In trying to create a large (siteexport.zip = 25-30 MB) offline user manual
Opencpn_User_Manual
of about 210 pages, the online creation stops at about 110 pages, with no indication (no blue vertical line scrolling under "110 of 210 pages") and no further messages.

Trying to determine the problem. Have turned on debug and fixed errors.

It seems to be a problem with

Cachetime =
plugin»siteexport»cachetime
Cache time for export 60 * 60 * 24 * 40 = 40 days!

Cachetime.txt

Message - "Finished but download failed"

This does not seem to extend the actual cache time used by the plugin.
Is this possibly a php setting problem?

Also tried extending
plugin»siteexport»max_execution_time
Maximum script execution time 2400*3

========
Does SiteExport use php for downloads? Are there PHP settings that we should be adjusting on the virtual server? (Using the Backup Tool it appears the backup is about 3.5gb, including everything.)

Is the way we handle links important for successful completion?
I.e.:

  1. Should all links to outside namespaces be external links with the full pathname?
  2. Should all links to internal download files be external links with the full pathname?

What are the other things that we need to be careful about to have successful completion?

NOTE: Downloading the entire wiki and running it locally seems to work and complete properly.
Thank you.

https://www.dokuwiki.org/faq:uploadsize#apache_users
http://stackoverflow.com/questions/19509440/increase-the-limit-of-file-download-size-in-document-management-system

siteexport > issue excluding a single page or multiple pages

@gamma

Any chance you could tell me how to exclude a single page or multiple pages when performing a multi-export with the siteexport plugin? When the original owner of the dokuwiki site deployed the server, he didn't organize the namespace at all, so I can't use the namespaces as a way to break up the export jobs. I'm running into issues with a few pages during the export (404 errors for missing components), which is causing the entire job to fail.

By the way, thank you for the time and energy you have given to make this plugin and provide support. It's very much appreciated.

Cyrillic links in pages don't work

Hi there.
I have a problem with internal links. I have pages with Cyrillic names, and they contain internal links that redirect to other Cyrillic-named pages. When I export a namespace that includes such pages, the links don't work.
Please give me an idea where in the code to look for the encoding of internal links, or why it doesn't work at all.
Thanks in advance.

// Example - Structure of pages
namespace:
животни
pages:
риба
toc

page toc content:
[[.:риба]]

When I open the toc file and click on the link риба, the browser tries to open:
file:///C:/Documents%20and%20Settings/user/Desktop/dokuwiki/export_test/D0_B6_D0_B8_D0_B2_D0_BE_D1_82_D0_BD_D0_B8/D1_80_D0_B8_D0_B1_D0_B0
I found out that the URL encoding of животни is D0_B6_D0_B8_D0_B2_D0_BE_D1_82_D0_BD_D0_B8 and the URL encoding of риба is D1_80_D0_B8_D0_B1_D0_B0.
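The encoding observed above is each UTF-8 byte of the page name written as two uppercase hex digits, joined with underscores. A sketch that reproduces it (an assumption based on the reported filenames; the plugin's actual encoding function may differ):

```python
# Reproduce the filename encoding seen in the exported links: every UTF-8
# byte of the page name becomes two uppercase hex digits, "_"-separated.
def encode_page_name(name: str) -> str:
    return "_".join(f"{b:02X}" for b in name.encode("utf-8"))

print(encode_page_name("риба"))     # the exported link target in the report
print(encode_page_name("животни"))  # the exported namespace folder name
```

If the browser URL-decodes the link before matching it against such a filename, the lookup will fail, which matches the behaviour described.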

Export of large documents is unsuccessful online.

NOTE: The only way found to get a single continuous and complete document exported is to download the dokuwiki to a local computer and run the plugin completely locally, due to an excessive number of failures, each of which requires "exclusion" of the last page.

Export of large documents is unsuccessful online (using PHP 5.6) due to excessive "excluded pages".
However, when a local dokuwiki is set up, export of the total document is feasible, taking approximately 20 minutes for 200 pages. Images land in directories as configured and links work properly.

NOTE: This experience used the commits Gerry recently made for Indexmenu, ---not presently in the master as of 1/25/2018.

Time Settings:

  • PHP max_execution_time 45 minutes
  • Configuration: plugin siteexport cachetime | 60 * 60 * 24 |
  • Configuration: plugin siteexport max_execution_time | 10000 |

See #85 (comment)

If you need to export a large html document, do it locally!
First download the entire dokuwiki and get it running locally.
Online, some interruption to the process causes 55 out of 200 pages to be "excluded". The plugin has to be run to find each "exclusion", then the page has to be excluded on the configuration page, then the plugin has to be run again to find the next "exclusion", which takes 20-30 minutes each time! Furthermore, when you export those excluded pages and try to expand them back into their place in the html document, all the images land in a new folder under the namespace, resulting in image folders all over the place. (Note: the pages do work when expanded into the proper location, and all the links do work.) Whereas when a single continuous download completes successfully, the image files are saved as originally stored and, of course, the links all work.

We had ample timeout settings for both PHP max_execution_time and the timeout setting in the Configuration, so this is not the problem.

Would using a more recent version of PHP (above 5.6) allow export of large html documents, without having "excluded" pages?

The settings used were:
From the Offline TOC page OR Main user page

  1. Select from the page's right menu, the icon second from the bottom Export SiteExport
  2. Set Namespace= opencpn:opencpn_user_manual:toc_offline_plugins_manual
  3. Parent Namespace to Export= opencpn:opencpn_user_manual:toc_offline_plugins_manual
  4. Select Export all subnamespaces
  5. To begin, disable the "Export linked pages" option. Once it processes completely, check it to have links in the document.
  6. Export all parameters "do" Check
  7. Render Engine: ckgedit
  8. Template: dokuwiki
  9. Everything else is unchecked.
  10. Just hit Start ( without first hitting one of the three "Direct", "Wget" or "Curl")
  11. When clicking the start button it should show you that it will export x/y pages.
  12. Starts out: Adding "opencpn:opencpn_user_manual:advanced_features:network_repeater" ( 29 / 385 )
  13. When the plugin completes you will be prompted to download the siteexport.zip file.

List of Excludes
required to complete processing the document Online for a successful download
Pages that had to be Excluded
Configuration > SiteExport > Excludes (such as this)
Keep this List of Excludes for later!

options_setting:ships display gps_setup_and_status vector_display_tab gallery_boats opencpn:toc opencpn:user_corner opencpn:supplementary_hardware opencpn:supplementary_software opencpn:developer_manual editors toolbar_buttons toolbar_buttons:tides_and_currents toolbar_buttons:route_mark_manager toolbar_buttons:create_route opencpn_user_manual:toolbar_buttons:ais ais:sart plugins:weather:weatherfax radar_overlay_ais:ais_radar_display charts:nv_charts weather:weather_routing weather:climatology utility_plugins:object_search charts:bsb4_charts utility_plugins:launcher sailing_tools:tactics safety:watchdog navigation:dr_dead_reckoning plugins:safety:sar safety:odraw1.4_pi advanced_features:portable_opencpn_v2 radar_overlay_ais:br24_radar navigation:celestial_navigation included_plugins:grib_weather charts:vfkaps options_setting:connections gps_setup_and_statusvector_display_tab .pdf zip 7z edit_user_manual blank blank.txt cruisersforum github

Leaving this in the Excludes field: .pdf zip 7z edit_user_manual blank blank.txt cruisersforum github

Use with plugin:gallery

Thank you for such a great plugin; it saves me at least a week at the end of each of my case investigations.

The only issue that I am having difficulty with is using siteexport with plugin:gallery. When I export with the html renderer, I get the webpages and the gallery display with the thumbnails. When I click a thumbnail, the 'source' image is not the high-resolution picture that the thumbnail comes from, but the now-pixelated thumbnail.

I see that the images are exported as .jpeg when they are originally .jpg. I can see the workaround of copying the original source images into the new siteexport folder and renaming them.

I would appreciate if there is a simple adjustment that allows the high resolution original images to be exported with the thumbnails from the plugin:gallery.

Thank you again.

Help, i'm not able to export :-(

hello,
I'm sure it is not a bug inside the plugin, it is a bug in front of the screen :-)
The following error occurs when I want to export a wiki page.
Adding "garten:start" ( 1 / 1 )

[2015-02-28 16:44:26 ERROR] Sending request failed with error, HTTP status was '200'.

[2015-02-28 16:44:26 ERROR] http://localhost/dokuwiki/doku.php?id=garten:start&template=vector

[2015-02-28 16:44:26 ERROR] Runtime Error: Creating temporary download file failed for 'http://localhost/dokuwiki/doku.php?id=garten:start&template=vector'. See log for more information.

Here is the apache.log

[Sat Feb 28 16:23:43.305035 2015] [access_compat:error] [pid 3772:tid 1760] [client ::1:53743] AH01797: client denied by server configuration: C:/Users/Held/xampp/htdocs/dokuwiki/data/security.png, referer: http://localhost/dokuwiki/doku.php?id=garten:aussenlampe&do=admin
[Sat Feb 28 16:30:44.643670 2015] [access_compat:error] [pid 3772:tid 1760] [client ::1:53875] AH01797: client denied by server configuration: C:/Users/Held/xampp/htdocs/dokuwiki/data/security.png, referer: http://localhost/dokuwiki/doku.php?id=wiki:ebook&do=admin
[Sat Feb 28 16:30:44.966688 2015] [access_compat:error] [pid 3772:tid 1760] [client ::1:53875] AH01797: client denied by server configuration: C:/Users/Held/xampp/htdocs/dokuwiki/data/security.png, referer: http://localhost/dokuwiki/doku.php?id=wiki:ebook&do=admin
[Sat Feb 28 16:42:04.775199 2015] [access_compat:error] [pid 3772:tid 1760] [client ::1:54004] AH01797: client denied by server configuration: C:/Users/Held/xampp/htdocs/dokuwiki/data/security.png, referer: http://localhost/dokuwiki/doku.php?id=garten:planung_blauer_garten&do=admin
[Sat Feb 28 16:43:36.014990 2015] [access_compat:error] [pid 3772:tid 1744] [client ::1:54015] AH01797: client denied by server configuration: C:/Users/Held/xampp/htdocs/dokuwiki/data/security.png, referer: http://localhost/dokuwiki/doku.php?id=garten:start&do=admin
[Sat Feb 28 16:43:36.904046 2015] [access_compat:error] [pid 3772:tid 1744] [client ::1:54015] AH01797: client denied by server configuration: C:/Users/Held/xampp/htdocs/dokuwiki/data/security.png, referer: http://localhost/dokuwiki/doku.php?id=garten:start&do=admin

I think, that something is wrong inside my apache config, but i'm not sure.

Any help will be appreciated

Regards
Poul

Using xampp on win8.1 with latest dokuwiki release.

All links refer to the current page only

The export of the pages (more than 200) works well (all the pages are downloaded with their content) ... but the hyperlinks on every page all refer to this page:

<!-- EDIT1 SECTION "Übersicht" [12-1247] -->
<h2 class="sectionedit2" id="bedienungsanleitung">Bedienungsanleitung</h2>
<div class="level2">
<ul>
<li class="level1"><div class="li"> Teil 1: <a href="../de/start.html" class="wikilink1" title="de:create:start">Den Fragebogen gestalten</a></div>
</li>
<li class="level1"><div class="li"> Teil 2: <a href="../de/start.html" class="wikilink1" title="de:survey:start">Vorbereitung und Start der Befragung</a></div>
</li>
<li class="level1"><div class="li"> Teil 3: <a href="../de/start.html" class="wikilink1" title="de:results:start">Erhobene Daten und Dokumentation</a></div>
</li>
<li class="level1"><div class="li"> Teil 4: <a href="../de/start.html" class="wikilink1" title="de:general">Allgemeine technische Informationen</a></div>
</li>
</ul>

In the online version (https://www.soscisurvey.de/help/doku.php/de:start) all the links work fine.

Is this possibly an interaction with the "Translation Plugin"?

No error reported when zip compression is missing.

I know that the documentation states "You have to have the zip compression library of your php installation activated".
However, I got no feedback that I had missed this, and the output just looked like it hung somewhere. No error of any kind was reported. I posted on the forum for help and got none there.
I did eventually get help elsewhere and learned that this was my problem.

If there is a way to check this during execution, then an error should be reported.
If not, two other possibilities exist:

  1. You might try providing a command for checking this.
  2. You could provide documentation supplying the expected status output and what it means.

All I got was a single line saying "Added..." and nothing more. Turning on Debug provided no more useful information.

After enabling siteexport plugin, warning messages appear at the top of every wiki page

Hi all,

First of all, thanks for providing the html export options through this plugin, I think it will be very useful! However, after installing the siteexport plugin, I get extraneous text along the top of the pages. I also noticed that the Trace feature (the breadcrumbs list of pages I recently visited) is not working (it just shows the current page).

See below for a screen shot.

I did have to do one extra manual step to get the plugin to show up without error messages.

After installing it (through the Extension Manager page on the Admin page) I had to manually create a file and make it writable:

touch ~/Sites/dokuwiki/inc/preload.php
chmod 777 ~/Sites/dokuwiki/inc/preload.php

If you have any suggestion as to how to remove these warning messages and recover the Trace feature, please let me know!

Thanks,
~Lina

(screenshot: 2016-08-23, 10:30 am)

Export not working, no paths for...

Start Process
Direct Download Link:
wget Download URL:
curl Download URL:

It used to have those paths. What should I do? I have uninstalled and reinstalled.
It completed before, with paths in the fields above, but did not give me the full namespace and down...
I would appreciate a more complete description of how to set this up so that it works.

Thanks

Status: "Finished but download failed"

Hi all,

I just installed the siteexport plugin and tried to export just a single page to HTML. Unfortunately, I got the message "Finished but download failed".

I poked around in the dokuwiki directories and saw that there are a few zip files that were created:

cd ~/Sites/dokuwiki/data/media/wiki

ls -la
total 24648
drwxr-xr-x  41 _www  staff    1394 Aug 23 10:04 .
drwxr-xr-x   7 _www  staff     238 Aug 31  2015 ..
-rw-r--r--@  1 _www  staff   33615 Mar 19  2015 dokuwiki-128.png
-rw-r--r--   1 _www  staff  320273 Aug 23 10:03 siteexport.auto.0ad7287ce4ce1f5b4636ccfd4fd2f42c.zip
-rw-r--r--   1 _www  staff  320288 Aug 23 10:04 siteexport.auto.109b1b9cba4ec825d2ac3b327e8d2453.zip
-rw-r--r--   1 _www  staff  320534 Aug 23 10:04 siteexport.auto.19cd9b20a4dc5f0485a3bef095e86661.zip

However, when I unzip one of them, it doesn't seem to contain the HTML file I was hoping for, just one file called "siteexport_preload-_register_template.html" and a folder called "lib". The html file does not contain the page I tried to export.

My wiki version is Release 2016-06-26a "Elenor of Tsort".

There are a few error messages in the Apache error file, but I am not sure if this is the right place to look:

[Tue Aug 23 09:55:56.948628 2016] [access_compat:error] [pid 93800] [client ::1:50630] AH01797: client denied by server configuration: /Users/lfaller/Sites/dokuwiki/data/security.png, referer: http://localhost/~lfaller/dokuwiki/doku.php?id=wiki:welcome&do=admin
[Tue Aug 23 09:56:36.782283 2016] [access_compat:error] [pid 396] [client ::1:50688] AH01797: client denied by server configuration: /Users/lfaller/Sites/dokuwiki/data/security.png, referer: http://localhost/~lfaller/dokuwiki/doku.php?id=wiki:welcome&do=admin
[Tue Aug 23 09:58:37.571085 2016] [access_compat:error] [pid 384] [client ::1:50759] AH01797: client denied by server configuration: /Users/lfaller/Sites/dokuwiki/data/security.png, referer: http://localhost/~lfaller/dokuwiki/doku.php?id=wiki:welcome&do=admin
[Tue Aug 23 10:00:37.175374 2016] [access_compat:error] [pid 396] [client ::1:50852] AH01797: client denied by server configuration: /Users/lfaller/Sites/dokuwiki/data/security.png, referer: http://localhost/~lfaller/dokuwiki/doku.php?id=wiki:welcome&do=admin
[Tue Aug 23 10:02:13.762132 2016] [access_compat:error] [pid 241] [client ::1:50905] AH01797: client denied by server configuration: /Users/lfaller/Sites/dokuwiki/data/security.png, referer: http://localhost/~lfaller/dokuwiki/doku.php?id=wiki:welcome&do=admin
[Tue Aug 23 10:15:24.105272 2016] [access_compat:error] [pid 396] [client ::1:51195] AH01797: client denied by server configuration: /Users/lfaller/Sites/dokuwiki/data/security.png, referer: http://localhost/~lfaller/dokuwiki/doku.php?id=wiki:welcome&do=admin
[Tue Aug 23 10:15:24.897481 2016] [access_compat:error] [pid 396] [client ::1:51195] AH01797: client denied by server configuration: /Users/lfaller/Sites/dokuwiki/data/security.png, referer: http://localhost/~lfaller/dokuwiki/doku.php?id=wiki:welcome&do=admin

Is there another log file I could dig up that might provide information as to how to fix this issue?

Thanks for any advice!

OpenCPN.org - Suggested Documentation & Example (almost working to completion)

Set Namespace: opencpn:opencpn_user_manual:toc <- Point directly to the TOC Page.
Set Parent Namespace: opencpn:opencpn_user_manual <- Point directly to the User Manual namespace.

Here is a documentation page for the OpenCPN.org "Books" Indexmenu Plugin showing how to build a
TOC page for a namespace that can be used in SiteExport.

https://opencpn.org/wiki/dokuwiki/doku.php?id=opencpn:opencpn_user_manual:edit_user_manual:test_navigation_toc#siteexport_pluginopencpn_user_manual_toc

Here is a documentation page for the OpenCPN.org "Books" SiteExport Plugin showing how to use the TOC page in SiteExport and what configuration settings we use.
https://opencpn.org/wiki/dokuwiki/doku.php?id=opencpn:opencpn_user_manual:edit_user_manual:site_export_plugin

There is also information about setting up the debug.txt file in case the plugin is not working.

We are still trying to get it to complete, however.

Distilled from Issues #69 #70 #71 #72 #73 and #74

Exclusions: Lib files - Why are these here?

After Export the siteexport.zip contains:
..\lib\exe\css.php.t.bootstrap3.css
..\lib\exe\js.php.bootstrap3.js
..\lib\exe\opensearch.html

What do these files do?
Why are they exported?
Is the css used by the exported html files?
How do we exclude these files?
I don't believe the plugin has access to these files.

Thanks very much for your help.

All Sub namespaces - Not focused on the necessary pages!

This selection does not do what it is intended to do. The command is totally misdirected and a huge waste of time. The resulting output is not what was requested, nor what would be considered "all sub namespaces".

Just the TOC Offline Charts page itself.

direct download link

https://opencpn.org/wiki/dokuwiki/doku.php?id=opencpn:opencpn_user_manual:toc_offline_user_manual_charts&addParams=1&depth=5&depthType=0.0&do=siteexport&ens=opencpn%3Aopencpn_user_manual%3Atoc_offline_user_manual_charts&exportLinkedPages=1&renderer=ckgedit&template=dokuwiki

wget --max-redirect=4 --output-document=siteexport.zip --post-data="id=opencpn:opencpn_user_manual:toc_offline_user_manual_charts&addParams=1&depth=5&depthType=0.0&do=siteexport&ens=opencpn%3Aopencpn_user_manual%3Atoc_offline_user_manual_charts&exportLinkedPages=1&renderer=ckgedit&template=dokuwiki" https://opencpn.org/wiki/dokuwiki/doku.php?id=doku.php --http-user=USER --http-passwd=PASSWD

curl -L --max-redirs 4 -o siteexport.zip -d "id=opencpn:opencpn_user_manual:toc_offline_user_manual_charts&addParams=1&depth=5&depthType=0.0&do=siteexport&ens=opencpn%3Aopencpn_user_manual%3Atoc_offline_user_manual_charts&exportLinkedPages=1&renderer=ckgedit&template=dokuwiki" https://opencpn.org/wiki/dokuwiki/doku.php?id=doku.php --anyauth --user USER:PASSWD

=====================
Offline TOC Charts (just 5 pages)

If I pick "all sub namespaces"

Adding "opencpn:opencpn_user_manual:advanced_features:inland_ecdis" ( 24 / 385 )

This is nuts.

* is exported to Unordered List Item

siteexport fails to export the following correctly to html:

  • TEST 1
  • TEST 2

The above is exported to the following in html:

  • Unordered List ItemTEST 1
  • Unordered List ItemTEST 2
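Until the renderer is fixed, a post-processing workaround is possible (not a plugin feature; this assumes the literal text "Unordered List Item" appears in the exported HTML exactly as shown above):

```python
# Workaround sketch: strip the stray "Unordered List Item" text that the
# export prepends to each bullet, leaving the real list content intact.
def strip_list_item_residue(html: str) -> str:
    return html.replace("Unordered List Item", "")

broken = '<li class="level1"><div class="li">Unordered List ItemTEST 1</div></li>'
print(strip_list_item_residue(broken))
```

Running this over every exported .html file would clean the lists without touching anything else, as long as no page legitimately contains that phrase.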

Not able to export

Hi!

I'm trying to export my dokuwiki. After I push the "start" button, I receive this:

[2015-08-26 22:04:07 ERROR] Sending request failed with error, HTTP status was '-100'.
[2015-08-26 22:04:07 ERROR] http://my-server/dokuwiki/doku.php/testenglish?template=dokuwiki
[2015-08-26 22:04:07 ERROR] Runtime Error: Creating temporary download file failed for 'http://my-server/dokuwiki/doku.php/testenglish?template=dokuwiki'. See log for more information.

Here the log file

[2015-08-26 22:04:04 INFO] Calculated the following Cache Hash URL:
[2015-08-26 22:04:04 INFO] doku.php/testenglish?addParams=1&depth=0&depthType=0.0&do=siteexport&ens=testenglish&renderer=&template=dokuwiki
[2015-08-26 22:04:04 INFO] Export Filename for 'start:siteexport' will be: 'auto.2b2c9ef8bc9c8995548fb8f6de851571.start:siteexport'
[2015-08-26 22:04:04 INFO] Prepared URL and POST from Request:
[2015-08-26 22:04:04 INFO] Array
[2015-08-26 22:04:04 INFO]([2015-08-26 22:04:04 INFO] [sectok] => 11dd48cb52cd20d495b22c43e77e719b
[2015-08-26 22:04:04 INFO] [ns] => testenglish
[2015-08-26 22:04:04 INFO] [ens] => testenglish
[2015-08-26 22:04:04 INFO] [depthType] => 0.0
[2015-08-26 22:04:04 INFO] [depth] => 0
[2015-08-26 22:04:04 INFO] [addParams] => 1
[2015-08-26 22:04:04 INFO] [renderer] =>
[2015-08-26 22:04:04 INFO] [template] => dokuwiki
[2015-08-26 22:04:04 INFO] [call] => __siteexport_generateurl
[2015-08-26 22:04:04 INFO] [http_credentials] =>
[2015-08-26 22:04:04 INFO] [u] =>
[2015-08-26 22:04:04 INFO] [p] =>
[2015-08-26 22:04:04 INFO])
[2015-08-26 22:04:04 INFO] Prepared URL and POST data:
[2015-08-26 22:04:04 INFO] Array
[2015-08-26 22:04:04 INFO]([2015-08-26 22:04:04 INFO] [0] => http://my-server/dokuwiki/doku.php/testenglish?addParams=1&depth=0&depthType=0.0&do=siteexport&ens=testenglish&renderer=&template=dokuwiki
[2015-08-26 22:04:04 INFO] [1] => doku.php/testenglish?addParams=1&depth=0&depthType=0.0&do=siteexport&ens=testenglish&renderer=&template=dokuwiki
[2015-08-26 22:04:04 INFO] [2] => doku.php/testenglish
[2015-08-26 22:04:04 INFO] [3] => addParams=1&depth=0&depthType=0.0&do=siteexport&ens=testenglish&renderer=&template=dokuwiki
[2015-08-26 22:04:04 INFO])
[2015-08-26 22:04:04 INFO] Checking for Cache, depthType:0
[2015-08-26 22:04:04 INFO] Generating Direct Download URL
[2015-08-26 22:04:04 INFO] http://my-server/dokuwiki/doku.php/testenglish?addParams=1&depth=0&depthType=0.0&do=siteexport&ens=testenglish&renderer=&template=dokuwiki
[2015-08-26 22:04:07 INFO] Calculated the following Cache Hash URL:
[2015-08-26 22:04:07 INFO] doku.php/testenglish?addParams=1&depth=0&depthType=0.0&do=siteexport&ens=testenglish&renderer=&template=dokuwiki
[2015-08-26 22:04:07 INFO] Export Filename for 'start:siteexport' will be: 'auto.2b2c9ef8bc9c8995548fb8f6de851571.start:siteexport'
[2015-08-26 22:04:07 INFO] Export Filename for 'siteexport' will be: 'auto.2b2c9ef8bc9c8995548fb8f6de851571.siteexport'
[2015-08-26 22:04:07 INFO] Checking for Cache, depthType:0
[2015-08-26 22:04:07 INFO] Calculated the following Cache Hash URL:
[2015-08-26 22:04:07 INFO] doku.php/testenglish?addParams=1&depth=0&depthType=0.0&do=siteexport&ens=testenglish&renderer=&template=dokuwiki
[2015-08-26 22:04:07 INFO] HASH-Pattern for CacheFile:
[2015-08-26 22:04:07 INFO] 2b2c9ef8bc9c8995548fb8f6de851571
[2015-08-26 22:04:07 INFO] Export Filename for 'start:siteexport' will be: 'auto.2b2c9ef8bc9c8995548fb8f6de851571.start:siteexport'
[2015-08-26 22:04:07 INFO] New CacheFile because the file was over the cachetime:
[2015-08-26 22:04:07 INFO] /var/www/dokuwiki/data/cache/e/e3bc5fa747aeac08adbd3b744e1f7ac7.siteexport
[2015-08-26 22:04:07 INFO] Export Filename for 'start:siteexport' will be: 'auto.2b2c9ef8bc9c8995548fb8f6de851571.start:siteexport'
[2015-08-26 22:04:07 INFO] ========================================
[2015-08-26 22:04:07 INFO] Adding Site: 'testenglish'
[2015-08-26 22:04:07 INFO] ----------------------------------------
[2015-08-26 22:04:07 INFO] Array
[2015-08-26 22:04:07 INFO]([2015-08-26 22:04:07 INFO] [sectok] => 11dd48cb52cd20d495b22c43e77e719b
[2015-08-26 22:04:07 INFO] [ns] => testenglish
[2015-08-26 22:04:07 INFO] [ens] => testenglish
[2015-08-26 22:04:07 INFO] [depthType] => 0.0
[2015-08-26 22:04:07 INFO] [depth] => 0
[2015-08-26 22:04:07 INFO] [addParams] => 1
[2015-08-26 22:04:07 INFO] [renderer] =>
[2015-08-26 22:04:07 INFO] [template] => dokuwiki
[2015-08-26 22:04:07 INFO] [call] => __siteexport_addsite
[2015-08-26 22:04:07 INFO] [site] => testenglish
[2015-08-26 22:04:07 INFO] [pattern] => 2b2c9ef8bc9c8995548fb8f6de851571
[2015-08-26 22:04:07 INFO] [base] => /dokuwiki/
[2015-08-26 22:04:07 INFO] [http_credentials] =>
[2015-08-26 22:04:07 INFO] [u] =>
[2015-08-26 22:04:07 INFO] [p] =>
[2015-08-26 22:04:07 INFO])
[2015-08-26 22:04:07 INFO] REQUEST for add_site:
[2015-08-26 22:04:07 INFO] Array
[2015-08-26 22:04:07 INFO]([2015-08-26 22:04:07 INFO] [template] => dokuwiki
[2015-08-26 22:04:07 INFO])
[2015-08-26 22:04:07 INFO] 'http://my-server/dokuwiki/'
[2015-08-26 22:04:07 INFO] Array
[2015-08-26 22:04:07 INFO]([2015-08-26 22:04:07 INFO] [0] => /dokuwiki/
[2015-08-26 22:04:07 INFO] [1] => http://my-server/dokuwiki/
[2015-08-26 22:04:07 INFO] [2] => /dokuwiki/
[2015-08-26 22:04:07 INFO] [3] => http://my-server/dokuwiki/
[2015-08-26 22:04:07 INFO])
[2015-08-26 22:04:07 INFO] internal WL function result: 'http://my-server/dokuwiki/doku.php/testenglish?template=dokuwiki'
[2015-08-26 22:04:07 INFO] 'http://my-server/'
[2015-08-26 22:04:07 INFO] Array
[2015-08-26 22:04:07 INFO]([2015-08-26 22:04:07 INFO] [0] => /dokuwiki/
[2015-08-26 22:04:07 INFO] [1] => http://my-server/dokuwiki/
[2015-08-26 22:04:07 INFO] [2] => /dokuwiki/
[2015-08-26 22:04:07 INFO] [3] => http://my-server/
[2015-08-26 22:04:07 INFO])
[2015-08-26 22:04:07 INFO] internal WL function result: 'http://my-server/testenglish'
[2015-08-26 22:04:07 INFO] Filename could be:
[2015-08-26 22:04:07 INFO] testenglish.html
[2015-08-26 22:04:07 INFO] Fetching URL: 'http://my-server/dokuwiki/doku.php/testenglish?template=dokuwiki'
[2015-08-26 22:04:07 ERROR] Sending request failed with error, HTTP status was '-100'.
[2015-08-26 22:04:07 ERROR] http://my-server/dokuwiki/doku.php/testenglish?template=dokuwiki
[2015-08-26 22:04:07 ERROR] Runtime Error: Creating temporary download file failed for 'http://my-server/dokuwiki/doku.php/testenglish?template=dokuwiki'. See log for more information.

Best Regards!
Valentine

Anchor/Bookmark Html Links are not made at the anchor.

Gamma,

I feel badly about posting, because SiteExport is an important and valuable tool for OpenCPN releases: it creates the Html documentation. The dust has settled now on release of OpenCPN v4.6 and the Help file has been created with manual piecing together, but it worked, referencing an IndexMenu Plugin TOC page.

We've found that Internal Anchor/Bookmark Html Links are not made at the Anchor location. This is probably caused by the fact that Dokuwiki just uses the Headings as automatic anchor links, which is very handy in Dokuwiki, but when we export we need to have those links actually created in order for any Internal or External link with an Anchor/Bookmark to work.

Perhaps there is a setting we have missed?
Thank you for making SiteExport!

An example


Site Export result:

`<a href="#ais_operating_controls" title="opencpn:opencpn_user_manual:toolbar_buttons:ais ?" class="wikilink1">AIS Operating Controls</a>`

`<h1>AIS Operating Controls</h1>`

Yet HTML requires an anchor target at the Anchor/Bookmark location:

`<h1>AIS Operating Controls</h1>`

`<a name="ais_operating_controls"></a>` (the anchor content can be left empty)

OR BETTER

`<h1  id="ais_operating_controls">AIS Operating Controls</h1>`

OR

`<h1 name="ais_operating_controls">AIS Operating Controls</h1>`

Same problem is occurring to External Page Anchor/Bookmark or "jump to" Links for the same reason.

Example from W3C
https://www.w3schools.com/html/html_links.asp

Create an anchor/bookmark with id attribute at the location (Not done by SiteExport).

`<h2 id="C4">Chapter 4</h2>`

Then add a link to the anchor/bookmark from within the same page (OK)

`<a href="#C4">Jump to Chapter 4</a>`

OR
Add a link to the anchor/bookmark from another page

`<a  href="html_demo.html#C4">Jump to Chapter 4</a>`

Differences between the id and name attributes, according to W3C:
Because of its specification in the HTML DTD, the name attribute may contain character references. Thus, the value `D&#xfc;rst` is a valid name attribute value, as is `D&uuml;rst`. The id attribute, on the other hand, may not contain character references.
The "name" attribute therefore allows richer anchor names (with entities).
https://www.w3.org/TR/html4/struct/links.html#h-12.2.1
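Until the export emits heading ids itself, a post-processing sketch could add them, slugifying the heading text the way DokuWiki section anchors look (lowercased, non-alphanumerics collapsed to underscores). This is an assumed workaround, not plugin behaviour:

```python
import re

# Give every exported <h1>-<h6> an id derived from its text, so that
# "#ais_operating_controls" style links have a target. Headings that
# already carry an id are left alone.
def add_heading_ids(html: str) -> str:
    def repl(m):
        level, attrs, text = m.group(1), m.group(2), m.group(3)
        if "id=" in attrs:
            return m.group(0)  # keep an existing id
        slug = re.sub(r"[^a-z0-9]+", "_", text.lower()).strip("_")
        return f'<h{level}{attrs} id="{slug}">{text}</h{level}>'
    return re.sub(r"<h([1-6])([^>]*)>([^<]*)</h\1>", repl, html)

print(add_heading_ids("<h1>AIS Operating Controls</h1>"))
```

The regex only handles headings without nested markup; headings containing inline tags would need an HTML parser instead.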

OpenCPN.org export

Working on OpenCPN.org export...

  1. Where is the debug log file located? I specified opencpn:debug2.txt, with log level "info", but I cannot find the file.

  2. What is the exact syntax of the "exclusion" field in the preferences? I tried ".gov", to exclude off-site references, and this does not seem to work.
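The configuration describes the exclude field as a regular expression ("Pattern to exclude resources"). Assuming the plugin applies it as a standard regex match against each resource path, a bare ".gov" is broader than intended, because an unescaped dot matches any character; escaping and anchoring narrows it. A sketch of the difference:

```python
import re

# Illustration only: how an escaped, anchored ".gov" pattern would behave
# if matched against resource URLs/paths (an assumption about the plugin's
# matching, not its documented behaviour).
pattern = re.compile(r"\.gov(/|$)")

print(bool(pattern.search("https://www.example.gov/page")))  # .gov host
print(bool(pattern.search("wiki:government_notes")))          # plain "gov" text
```

Note that the unescaped ".gov" would also match the second string (any character followed by "gov"), which may be why the field seems not to work as expected.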

Thanks
Dave

Authentication via AD/LDAP

Authentication against IIS with anonymous logins disabled does not work. The HTTP client does not support any authentication other than Basic. It would be required to either forward an existing cookie, have the browser do the authentication dance, or implement it ourselves.

siteexport breaks indexer?

It looks like siteexport makes DokuWiki's indexing mechanism unusable. I didn't have time to troubleshoot inside of siteexport (yet); if I disable siteexport, it works fine again.

It can be easily reproduced by hitting the indexer via CLI:

root@wiki:/var/www# php5 bin/indexer.php 
PHP Fatal error:  Call to a member function process_event() on a non-object in /var/www/wiki.be.nts.ch/inc/events.php on line 70

dokuwiki-2015-08-10a

Direct Download Link does not do the same as Start button

Hi

I tried to use the "Direct Download Link" but it does not export the same content as when clicking on the Start button (dokuwiki is hosted on windows 2012 R2).

The Direct Download Link is:

http://www.server.com/wiki/doku.php?id=sup&addParams=1&depth=0&depthType=1.0&do=siteexport&ens=sup&exportBody=1&template=dokuwiki

(the problem here is that exportBody seems to be ignored)

Clicking on the Start button sends this in http post:

sectok=9286fd6d588a60b1cd20b871ffdb2326&ns=sup&ens=sup&depthType=1.0&depth=0&exportBody=1&addParams=1&renderer=xhtml&template=dokuwiki&call=__siteexport_getsitelist

(here exportBody is respected correctly)

siteexport.zip is not the same. I think that the Direct Download Link is not working correctly.

Error in line 665 in inc/functions.php

Since some hours, I get an error on my logs:
PHP Fatal error: Can't use method return value in write context in /srv/www/linux/dokuwiki/lib/plugins/siteexport/inc/functions.php on line 665

In my file, this line reads:
if ( count($baseParts) == $originalBasePartsCount && $existingPageID != null && !empty( $this->getConf("offSiteLinkTemplate") ) ) {

I'm running DokuWiki on PHP 5.3.17 on SLES 11 SP4. The plugin was updated first in August, 2015, and updated last "Thu, 31 Mar 2016 01:07:02 +0000". The errors start 2 minutes later :(
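The error comes from calling `empty()` on a function return value, which PHP only allows from 5.5 on; on PHP 5.3 it is a fatal error. A sketch of the usual compatibility fix, capturing the value first (variable name is illustrative):

```php
<?php
// PHP < 5.5 compatible rewrite of the quoted line: empty() may only take
// a variable there, so store the config value before testing it.
$offSiteLinkTemplate = $this->getConf("offSiteLinkTemplate");
if ( count($baseParts) == $originalBasePartsCount && $existingPageID != null && !empty($offSiteLinkTemplate) ) {
    // ... unchanged body ...
}
```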

URL Problems

But I think there is another issue causing the errors above: there is an "&amp;" instead of "&" in the fetched URL.
Some lines of the log:

(screenshot of the log, 2013-11-24 22:13)

Finished - download failed. Please check your settings.

Hi,

now I've got the following output in the status panel: "Finished - download failed. Please check your settings."

Before the last issue #2 was fixed, I got a temporary zip file in media/wiki ... now there is nothing ...

Some lines of the Log:

(screenshot of the log, 2013-11-25 21:21)

Exclusion of various things:

Exporting to html using Render Engine: CKGedit

*.pdf
It should be possible to exclude ".pdf" downloads, but I cannot come up with the correct regex expression. It gets tangled up with the dw2pdf plugin somehow. Still fussing with this.

[ns]:blank, and the parent ns if it is empty,
so we can exclude them.

Also any empty [ns].

pre in headers etc.

Full searchable index developed from Indexmenu Plugin

When I use siteexport on the "fullindex.txt" page
(Later note: I've deleted this page and revised it to be a full TOC that is flat and open:
https://opencpn.org/wiki/dokuwiki/doku.php?id=opencpn:opencpn_user_manual:toc )
I put it under the namespace because that appears to be what you prefer, judging by the forms.

which has this indexmenu code:
==== Indexmenu Navigation TOC ====
{{indexmenu>:opencpn:opencpn_user_manual#1|js#drktheme navbfar msort nsort}}
Later note: Code changed to:
[[:opencpn:opencpn_user_manual]] <-- Have to add this parent page somehow or is that automatic?
{{indexmenu>:opencpn:opencpn_user_manual#5|msort nsort nojs}}
--See bottom notes now.

===== Alternative Indexmenu =====
{{indexmenu>. | notoc nojs}}

I get this linked page output when using siteexport and the ckgedit format:
(screenshot: collapsed index menu)

We would like to have the fully expanded menu in the output, with links to all the pages so that the index and pages are searchable. Is there a way to do this?


READ indexmenu > TOC.txt (similar to the DOS pipe command DIR > List.txt)

Last night I realized that everything needed is in the Indexmenu we've set up. The only thing missing is a reader that will read it line by line and copy it to the end of a file (similar to the DOS command "DIR > List.txt"). This would be something done in PHP code, but it could be very simple, and it would produce a TOC file that SiteExport could read to make the HTML document.
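The "reader" idea above can be sketched quickly. This hypothetical helper walks a standard DokuWiki pages/ directory and emits one flat `[[link]]` line per page for a TOC page (directory layout and link syntax assumed from stock DokuWiki, not from this plugin):

```python
from pathlib import Path

# Walk a DokuWiki pages/ directory and emit one "[[:ns:page]]" bullet per
# .txt page, giving a flat TOC that SiteExport could consume.
def build_toc(pages_dir: str, root_ns: str) -> list[str]:
    lines = []
    for txt in sorted(Path(pages_dir).rglob("*.txt")):
        page_id = ":".join(txt.relative_to(pages_dir).with_suffix("").parts)
        lines.append(f"  * [[:{root_ns}:{page_id}]]")
    return lines
```

Writing the returned lines into a wiki page (e.g. a "toc" page under the manual namespace) would give the fully expanded, searchable, linked index described above.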

To get the expanded searchable linked TOC at the head of the document, we would add a page similar to "fullindex.txt" and paste in the TOC created.


Delete or not include "Blank" pages and associated Namespaces
There is one other small thing that would be helpful.

Do not include blank pages and associated namespaces [check]

To get indexmenu to sort properly whenever we have a page with no associated namespace, we added a "blank" page underneath and entered a line "This page left blank intentionally." That gets indexmenu to sort properly because it is then just sorting namespaces.

We will have to delete these pages and namespaces in the HTML manually, but if there were some way not to include those namespaces and blank pages, it would be great. I am trying to figure out how to do that. Perhaps all the pages would have to really be "blank", with no text at all!
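Since every placeholder page contains the same marker line, a post-processing step could detect them by that text. A minimal sketch (the marker string is taken from the post above; the function name is hypothetical):

```python
# Marker text the poster puts on placeholder pages used to make indexmenu sort.
BLANK_MARKER = "This page left blank intentionally."

def is_blank_page(page_text: str) -> bool:
    """True if a page is empty or contains only the placeholder marker."""
    stripped = page_text.strip()
    return stripped == "" or stripped == BLANK_MARKER
```

An export filter could then skip any page for which is_blank_page() is true, and drop a namespace once all of its pages are skipped.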

Thank you, Gerry
