[RULE] Re: Restricted requirements for RULE website

M. Fioretti m.fioretti at inwind.it
Sat Mar 13 20:38:07 EET 2004

On Tue, Mar 09, 2004 11:53:20 AM +0100, C David Rigby <cdrigby at 9online.fr> wrote:

> In summary, all CMS packages seem to require modification and/or
> extension to meet the goals as Marco outlined previously.  Active
> CMS projects may well be willing to help us achieve this goal, if
> that seems interesting to them. Rodolfo's security concerns are
> well-founded, and we need to satisfy his criteria before we throw an
> interactive site on his server.

I've been thinking about the alternative solution below. Please let
me know what you think.

General: giving up MySQL for plain text files only is possible but,
as far as I know, doesn't scale well. Once we (hopefully) have 150-200
pages or software packages, how do you search them by date, category,
or whatever else? Much more cumbersome without a database, or maybe not?
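To make the scaling worry concrete, here is a minimal sketch of what
searching flat files looks like. The file names and the Category:/Date:
header lines are my own invention, not the project's actual .map format:

```shell
#!/bin/sh
# Hypothetical flat-file software map: each entry is a small text file
# with header lines. Searching by category means scanning every file;
# with hundreds of entries, a database index would avoid the full scan.
mkdir -p /tmp/rule-map-demo && cd /tmp/rule-map-demo

cat > editor.map <<'EOF'
Name: vim-tiny
Date: 2004-03-01
Category: editors
EOF

cat > mailer.map <<'EOF'
Name: mutt
Date: 2004-02-15
Category: mail
EOF

# Search by category: grep must read every entry file.
grep -l '^Category: editors' *.map
```

This works, but every query re-reads every file; the equivalent MySQL
query would use an index instead.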

Now the proposal. Whenever I or other registered website contributors
want to upload software map entries, news or content to the website:

	 1) We make all changes/new files on our PCs. Future web pages
	 have a .txt or .php extension; future news and software map
	 entries have a .nws or .map extension and a rigidly defined
	 format. All plain ASCII stuff. Future web pages sit in a
	 directory structure mirroring the one online (/, /docs,

	 2) A tarball is made containing all these new files

	 3) The tarball is uploaded to the server via normal FTP with
	 a password, or via an HTML form over HTTPS.

	 4) The tarball is validated by entering its checksum or
	 digital signature in the form

	 5) Shell scripts run from cron on the server (some already
	 exist) convert the content to HTML, update the dynamic map,
	 and/or the relevant news/software map MySQL entries

	 6) The scripts also send email to the list announcing the
	 new content, and update the RSS file.
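A rough sketch of what steps 4 and 5 could look like server-side. The
paths, the md5sum check, and the trivial .nws-to-HTML conversion are
all my assumptions for illustration, not the existing scripts:

```shell
#!/bin/sh
# Sketch of server-side processing: verify the uploaded tarball's
# checksum (step 4), then unpack and convert .nws entries to HTML
# fragments (step 5). All paths and formats here are hypothetical.
set -e
INCOMING=/tmp/rule-incoming
STAGE=/tmp/rule-stage
mkdir -p "$INCOMING" "$STAGE"

# Simulate an upload: a tarball plus the checksum submitted in the form.
printf 'RULE 0.2 released\n' > "$INCOMING/news1.nws"
( cd "$INCOMING" && tar cf upload.tar news1.nws )
md5sum "$INCOMING/upload.tar" | awk '{print $1}' > "$INCOMING/upload.md5"

# Step 4: validate the tarball against the submitted checksum.
sum=$(md5sum "$INCOMING/upload.tar" | awk '{print $1}')
[ "$sum" = "$(cat "$INCOMING/upload.md5")" ] || {
    echo "checksum mismatch, rejecting upload" >&2
    exit 1
}

# Step 5: unpack and convert each .nws entry to a minimal HTML fragment.
tar xf "$INCOMING/upload.tar" -C "$STAGE"
for f in "$STAGE"/*.nws; do
    { echo "<p>"; cat "$f"; echo "</p>"; } > "${f%.nws}.html"
done
```

Step 6 (the announcement email and RSS update) would hang off the end
of the same script, but mail delivery is site-specific so I left it out.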

The main differences between this and any traditional CMS are that:

    a) it preserves all the (admittedly little) content we already have
    b) unlike anything PHP/MySQL-based, we already have some scripts,
    and I am sure I can do the rest of points 5 and 6 *myself*, without
    much effort or having to read two or three more books just for this
    task.

The main point here is still security, of course, in two parts:

    a) somebody other than me has to describe/figure out/set up points
    3 and 4 above in a safe manner

    b) we have to guarantee that the scripts which parse the tarball
    content cannot damage the website, even when that content is
    malicious or simply badly written. Yes, in principle this is no
    different than if we had done everything through PHP forms only,
    but shell scripting I can do myself and submit for peer review:
    that is maybe the main difference
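As one example of the kind of check peer review should enforce, here is
a hypothetical guard (my own sketch, not an existing script) against
tarballs whose members try to escape the target directory via absolute
paths or ".." components:

```shell
#!/bin/sh
# Hypothetical pre-extraction check: list the tarball's members and
# reject any absolute path or any path containing "..", so a malicious
# upload cannot overwrite files outside the staging directory.
mkdir -p /tmp/rule-sec && cd /tmp/rule-sec
mkdir -p docs && echo hi > docs/page.txt
tar cf good.tar docs/page.txt

safe_tar() {
    tar tf "$1" | while read -r member; do
        case "$member" in
            /*|../*|*/../*) echo "rejected member: $member" >&2; exit 1 ;;
        esac
    done
}

if safe_tar good.tar; then echo "good.tar accepted"; fi
```

Only after this check passes would the real extraction run, and even
then into a throwaway staging directory rather than the live tree.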

The last missing piece is roles: how to let other (REGISTERED!)
contributors upload content in such a way that I am informed, can
review it, and can authorize publication. I also have ideas about how
to implement this, but I'd rather hear from you first whether all of
the above makes sense.
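One possible shape for this (the directory layout and the approve
helper are purely hypothetical): uploads land in a per-contributor
pending area, and nothing reaches the live tree until an explicit
approval moves it there:

```shell
#!/bin/sh
# Hypothetical review workflow: contributor uploads sit in a pending
# area; the maintainer reviews them and an explicit approve() call is
# the only thing that publishes a file into the live tree.
PENDING=/tmp/rule-pending
LIVE=/tmp/rule-live
mkdir -p "$PENDING/contributor1" "$LIVE"

# A contributor's upload waits in pending, not in the live tree.
echo "New package entry" > "$PENDING/contributor1/entry1.map"

approve() {   # approve <user> <file>: publish one reviewed entry
    mv "$PENDING/$1/$2" "$LIVE/$2"
}

approve contributor1 entry1.map
ls "$LIVE"
```

The cron scripts from point 5 would then only ever read from the live
tree, so an unreviewed upload can never be published by accident.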

Please do come back with lots of critiques and comments!!

	Marco Fioretti
Marco Fioretti                 mfioretti
Red Hat for low memory         www.rule-project.org

There are tasks that cannot be done by more than ten men, or less than
one hundred

Original home page of the RULE project: www.rule-project.org
Original Rule Development Site http://savannah.gnu.org/projects/rule/
Original RULE mailing list: Rule-list at nongnu.org, hosted at http://mail.nongnu.org/mailman/listinfo/rule-list

This full static mirror of the Run Up to Date Linux Everywhere Project mailing list, originally hosted at http://lists.hellug.gr/mailman/listinfo/rule-list, is kept online by Free Software popularizer, researcher and trainer Marco Fioretti.