This commit was manufactured by cvs2svn to create tag 'helma_1_2_rc1'.
hns 2002-12-06 18:17:34 +00:00
parent 817395e8b0
commit 4ff57fb2ac
14 changed files with 195 additions and 367 deletions

View file

@@ -1,97 +1,93 @@
This is the README file for version 1.2 of Helma Object Publisher.

-===========
-ABOUT HELMA
-===========
+============================
+ABOUT HELMA OBJECT PUBLISHER
+============================

-Helma is a scriptable platform for creating dynamic, database backed
-web sites.
+Helma Object Publisher is a web application server.

-Helma provides an easy way to map relational database tables to objects.
-These objects are wrapped with a layer of scripts and skins that allow
-them to be presented and manipulated over the web. The clue here is that
-both functions and skins work in an object oriented manner and force
-a clear separation between content, functionality and presentation.
+With Helma Object Publisher (sometimes simply refered to as Helma or
+Hop) you can define Objects and map them to a relational database
+table. These so-called HopObjects can be created, modified and deleted
+using a comfortable object/container model. Hence, no manual fiddling
+around with database code is necessary.

-Actions are special functions that are callable over the web. Macros are
-special functions that expose functionality to the presentation layer.
-Skins are pieces of layout that do not contain any application logic,
-only macro tags as placeholders for parts that are dynamically provided
-by the application.
+HopObjects are extended JavaScript objects which can be scripted using
+server-side JavaScript. Beyond the common JavaScript features, Helma
+provides special "skin" and template functionalities which facilitate
+the rendering of objects via a web interface.

-In short, Helma provides a one stop framework to create web applications
-with less code and in shorter time than most of the other software out
-there.
+Thanks to Helma's relational database mapping technology, HopObjects
+create a hierarchical structure, the Url space of a Helma site. The
+parts between slashes in a Helma Url represent HopObjects (similar to
+the document tree in static sites). The Helma Url space can be thought
+of as an analogy to the Document Object Model (Dom) in client-side
+JavaScript.

===================
SYSTEM REQUIREMENTS
===================

-You need a Java virtual machine 1.3 or higher to run Helma.
-
-For Windows, Linux and Solaris you can get a Java runtime or development
-kit from http://java.sun.com/j2se/downloads.html.
-
-If you are on Mac OS X, you already have a Java runtime that will work
-well with Helma.
-
-Unfortunately, there is no Java 2 interpreter for Mac OS Classic, so
-you can't use Helma on Mac OS 9.
+Windows: 1) On Windows Helma won't run with Microsoft's version of
+Java (jview). You can get a compatible Java runtime from Sun or IBM:
+http://java.sun.com/j2se/1.3/jre/download-windows.html
+http://www.ibm.com/java/jdk/download/ 2) In the Windows start script
+I've hardcoded c:\java\lib\classes.zip in the CLASSPATH argument. Not
+good.
+
+Macintosh: 1) If you are using the Mac version you should own a G3 CPU
+and/or have MRJ 2.2 (http://www.apple.com/java) installed. Other
+platforms have less frustration potential. The Mac OS version
+currently is not up-to-date (version 0.1).
+
+Linux: The recomended virtual machine for running Helma on Linux is
+Blackdown's port of JDK 1.2.2 RC4
+(http://www.blackdown.org/java-linux/mirrors.html). JDK 1.1.7v3 will
+work, but much slower and show a lot of CPU activity even when the
+Helma is idle. IBM's version of JDK 1.1.8 also works well, but this
+JVM has some problems of its own with thread handling.

============================
INSTALLING AND RUNNING HELMA
============================

-Simply unzip or untar the contents of the archive file into any place
-on your hard disk. Start Helma by invoking hop.bat or hop.sh from the
-command line, depending on whether you are on Windows or
-Linux/Unix/MacOSX. If the java command is not found, try setting the
-JAVA_HOME variable in the start script to the location of your Java
-installation.
-
-You may also want to have a look at the start script for other settings.
-
-You can adjust server wide settings in the server.properties file. For
-example, you should set the smtp property to the name of the SMTP server
-that Helma should use to send Email. Applications can be started or
-stopped by editing the apps.properties file through the web interface
-using the Management application that is part of Helma.
+Simply unzip the contents of the archive file into any place on your
+hard disk. Start Helma by opening the file hop.bat or hop.sh,
+respectively.

If you manage to get it running you should be able to connect your
-browser to http://localhost:8080/ or http://127.0.0.1:8080/
-(port 8080 on the local machine, that is).
+browser to http://127.0.0.1:8080/ (port 8080, that is).

-Helma comes with a version of Jetty, a lightweight yet industrial strenth
-web server developed by Mortbay Consulting. See http://jetty.mortbay.com/
-for more information. While Jetty works well for deploying real web sites,
-you may want to run Helma behind an existing web server. This is most
-easily done by running Helma with the AJPv13 listener which allows you to
-plug Helma into any web server using the Apache mod_jk module. See
-http://jakarta.apache.org/tomcat/tomcat-4.1-doc/jk2/index.html for more
-information on mod_jk and AJPv13.
+This version is set up to use its own embedded Web server and a very
+basic embedded object database. For this reason it is able to run
+virtually without installation on any platform with a Java 1.1 virtual
+machine.

-Finally, Helma can be plugged into Servlet containers using Servlet
-classes that communicate with Helma either directly or via Java RMI.
-(Be warned that these options may be harder to set up and maintain though,
-since most of the recent development efforts have been geared towards the
-mod_jk/AJPv13 setup.)
+On the other hand, the embedded Web server and object db are meant for
+development work and not ready for prime time deployment. For that
+you'd probably use an external relational database, the Berkeley DB
+package and a full featured Web server like Apache.

=====================================
DOCUMENTATION AND FURTHER INFORMATION
=====================================

-Currently, documentation-in-progress is available online at
-http://helma.org/. We know that it sucks and hope to do some substantial
-improvments within the coming weeks and months.
+Currently, a documentation-in-progress is available online only.
+Please refer to http://helma.org/docs/.

-Your input is highly welcome. There is a mailing-list to discuss Helma at
-http://helma.org/lists/listinfo/hop. Don't hesitate to voice any questions,
-proposals, complaints, praise you may have on the list. We know we have
-a lot to do and to learn, and we're open to suggestions.
+For further information http://helma.org generally is a good place.
+There is also a mailing-list about Helma-related stuff available at
+http://helma.org/lists/listinfo/hop.
+
+For questions, comments or suggestions feel free to contact
+tobi@helma.at.
+
+For questions, comments or suggestions also feel free to contact
+hannes@helma.at.

--
-Last modified on December 5, 2002 by Hannes Wallnoefer <hannes@helma.at>
+This document was last modified on Friday 22 June 2001 by
+tobi@helma.at

View file

@@ -25,7 +25,7 @@ On Unix systems open a terminal window, change to the Antclick
directory and type ./hop.sh.

If you manage to get it running you should be able to connect your
browser to http://127.0.0.1:8080/ (port 8080, that is). Now you can
set up and configure your antville site.
@@ -33,111 +33,89 @@ set up and configure your antville site.

ABOUT ANTVILLE
==============

Antville is an open source project aimed to the development of an
"easy to maintain and use" weblog-hosting system. It is not limited
to just one weblog, it can easily host up to several hundred or
thousand weblogs (the number of weblogs is more limited by the site
owner's choice and server power than software limitations).

Antville is entirely written in JavaScript and based on the Helma
Object Publisher, a powerful and fast scriptable open source web
application server (which itself is written in Java). Antville works
with a relational database in the backend.

-Check out http://project.antville.org/ for more information.
-
-===========
-ABOUT HELMA
-===========
+============================
+ABOUT HELMA OBJECT PUBLISHER
+============================

-Helma is a scriptable platform for creating dynamic, database backed
-web sites.
+Helma Object Publisher is a web application server.

-Helma provides an easy way to map relational database tables to objects.
-These objects are wrapped with a layer of scripts and skins that allow
-them to be presented and manipulated over the web. The clue here is that
-both functions and skins work in an object oriented manner and force
-a clear separation between content, functionality and presentation.
+With Helma Object Publisher (sometimes simply refered to as Helma or
+Hop) you can define Objects and map them to a relational database
+table. These so-called HopObjects can be created, modified and deleted
+using a comfortable object/container model. Hence, no manual fiddling
+around with database code is necessary.

-Actions are special functions that are callable over the web. Macros are
-special functions that expose functionality to the presentation layer.
-Skins are pieces of layout that do not contain any application logic,
-only macro tags as placeholders for parts that are dynamically provided
-by the application.
+HopObjects are extended JavaScript objects which can be scripted using
+server-side JavaScript. Beyond the common JavaScript features, Helma
+provides special "skin" and template functionalities which facilitate
+the rendering of objects via a web interface.

-In short, Helma provides a one stop framework to create web applications
-with less code and in shorter time than most of the other software out
-there.
+Thanks to Helma's relational database mapping technology, HopObjects
+create a hierarchical structure, the Url space of a Helma site. The
+parts between slashes in a Helma Url represent HopObjects (similar to
+the document tree in static sites). The Helma Url space can be thought
+of as an analogy to the Document Object Model (Dom) in client-side
+JavaScript.

===================
SYSTEM REQUIREMENTS
===================

-You need a Java virtual machine 1.3 or higher to run Helma.
-
-For Windows, Linux and Solaris you can get a Java runtime or development
-kit from http://java.sun.com/j2se/downloads.html.
-
-If you are on Mac OS X, you already have a Java runtime that will work
-well with Helma.
-
-Unfortunately, there is no Java 2 interpreter for Mac OS Classic, so
-you can't use Helma on Mac OS 9.
+You need Java 2 runtime version 1.3 or higher to run Helma. Helma has
+been used successfully on Windows, Linux and Mac OS X platforms.

============================
INSTALLING AND RUNNING HELMA
============================

-Simply unzip or untar the contents of the archive file into any place
-on your hard disk. Start Helma by invoking hop.bat or hop.sh from the
-command line, depending on whether you are on Windows or
-Linux/Unix/MacOSX. If the java command is not found, try setting the
-JAVA_HOME variable in the start script to the location of your Java
-installation.
-
-You may also want to have a look at the start script for other settings.
-
-You can adjust server wide settings in the server.properties file. For
-example, you should set the smtp property to the name of the SMTP server
-that Helma should use to send Email. Applications can be started or
-stopped by editing the apps.properties file through the web interface
-using the Management application that is part of Helma.
+Simply unzip the contents of the archive file into any place on your
+hard disk. Start Helma by opening the file hop.bat or hop.sh,
+respectively.

If you manage to get it running you should be able to connect your
-browser to http://localhost:8080/ or http://127.0.0.1:8080/
-(port 8080 on the local machine, that is).
+browser to http://127.0.0.1:8080/ (port 8080, that is).

-Helma comes with a version of Jetty, a lightweight yet industrial strenth
-web server developed by Mortbay Consulting. See http://jetty.mortbay.com/
-for more information. While Jetty works well for deploying real web sites,
-you may want to run Helma behind an existing web server. This is most
-easily done by running Helma with the AJPv13 listener which allows you to
-plug Helma into any web server using the Apache mod_jk module. See
-http://jakarta.apache.org/tomcat/tomcat-4.1-doc/jk2/index.html for more
-information on mod_jk and AJPv13.
+This version is set up to use its own embedded Web server and a very
+basic embedded object database. For this reason it is able to run
+virtually without installation on any platform with a Java 1.1 virtual
+machine.

-Finally, Helma can be plugged into Servlet containers using Servlet
-classes that communicate with Helma either directly or via Java RMI.
-(Be warned that these options may be harder to set up and maintain though,
-since most of the recent development efforts have been geared towards the
-mod_jk/AJPv13 setup.)
+On the other hand, the embedded Web server and object db are meant for
+development work and not ready for prime time deployment. For that
+you'd probably use an external relational database, the Berkeley DB
+package and a full featured Web server like Apache.

=====================================
DOCUMENTATION AND FURTHER INFORMATION
=====================================

-Currently, documentation-in-progress is available online at
-http://helma.org/. We know that it sucks and hope to do some substantial
-improvments within the coming weeks and months.
+Currently, a documentation-in-progress is available online only.
+Please refer to http://helma.org/.

-Your input is highly welcome. There is a mailing-list to discuss Helma at
-http://helma.org/lists/listinfo/hop. Don't hesitate to voice any questions,
-proposals, complaints, praise you may have on the list. We know we have
-a lot to do and to learn, and we're open to suggestions.
+For further information http://helma.org generally is a good place.
+There is also a mailing-list about Helma-related stuff available at
+http://helma.org/lists/listinfo/hop.
+
+For questions, comments or suggestions feel free to contact
+tobi@helma.at.
+
+For questions, comments or suggestions also feel free to contact
+antville@helma.org.

--
-Last modified on December 5, 2002 by Hannes Wallnoefer <hannes@helma.at>
+This document was last modified on Friday 25 October 2002 by
+hannes@helma.at

View file

@@ -35,7 +35,7 @@ rem set JAVA_OPTIONS=-server -Xmx128m
:::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
:: Setting the script path
-set INSTALL_DIR=%~d0%~p0
+set SCRIPT_DIR=%~d0%~p0
:: Using JAVA_HOME variable if defined. Otherwise,
:: Java executable must be contained in PATH variable
@@ -48,7 +48,7 @@ if "%JAVA_HOME%"=="" goto default
:: Setting HOP_HOME to script path if undefined
if "%HOP_HOME%"=="" (
-set HOP_HOME=%INSTALL_DIR%
+set HOP_HOME=%SCRIPT_DIR%
)
cd %HOP_HOME%
@@ -76,4 +76,4 @@ if not "%HOP_HOME%"=="" (
)
:: Invoking the Java virtual machine
-%JAVACMD% %JAVA_OPTIONS% -jar "%INSTALL_DIR%\launcher.jar" %OPTIONS%
+%JAVACMD% %JAVA_OPTIONS% -jar launcher.jar %OPTIONS%

View file

@@ -1,7 +1,7 @@
#!/bin/sh
# Shell script for starting Helma with a JDK-like virtual machine.
# To add JAR files to the classpath, simply place them into the
# lib/ext directory.
# uncomment to set JAVA_HOME variable
@@ -32,11 +32,6 @@ else
JAVACMD=java
fi
-# Get the Helma installation directory
-INSTALL_DIR="${0%/*}"
-cd $INSTALL_DIR
-INSTALL_DIR=$PWD
# get HOP_HOME variable if it isn't set
if test -z "$HOP_HOME"; then
# try to get HOP_HOME from script file and pwd
@@ -69,5 +64,5 @@ if [ "$HOP_HOME" ]; then
SWITCHES="$SWITCHES -h $HOP_HOME"
fi
-# Invoke the Java VM
-$JAVACMD $JAVA_OPTIONS -jar "$INSTALL_DIR/launcher.jar" $SWITCHES
+# Invoking the Java VM
+$JAVACMD $JAVA_OPTIONS -jar launcher.jar $SWITCHES

View file

@@ -8,7 +8,7 @@
<target name="init">
<property name="Name" value="helma"/>
<property name="year" value="1998-${year}"/>
-<property name="version" value="1.2-rc2"/>
+<property name="version" value="1.2-rc1"/>
<property name="project" value="helma"/>
<property name="build.compiler" value="classic"/>
@@ -28,7 +28,7 @@
<property name="jar.name" value="${project}"/>
<property name="package.name" value="${project}-${version}"/>
-<property name="antclick.name" value="antclick-1.0pre3"/>
+<property name="antclick.name" value="antclick-1.0pre2"/>
<property name="debug" value="on"/>
<property name="optimize" value="on"/>
@@ -196,13 +196,10 @@
<!-- copy the launcher jar file -->
<copy file="${home.dir}/launcher.jar" todir="${build.work}/"/>
-<!-- copy README.txt -->
-<copy file="${home.dir}/README.txt" todir="${build.work}/"/>
<!-- copy the whole docs-directory -->
-<!-- copy todir="${build.work}/docs">
+<copy todir="${build.work}/docs">
<fileset dir="${build.docs}"/>
-</copy -->
+</copy>
<!-- copy all libraries except helma-YYYYMMDD.jar -->
<copy todir="${build.work}/lib">

View file

@@ -35,7 +35,7 @@ rem set JAVA_OPTIONS=-server -Xmx128m
:::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
:: Setting the script path
-set INSTALL_DIR=%~d0%~p0
+set SCRIPT_DIR=%~d0%~p0
:: Using JAVA_HOME variable if defined. Otherwise,
:: Java executable must be contained in PATH variable
@@ -48,7 +48,7 @@ if "%JAVA_HOME%"=="" goto default
:: Setting HOP_HOME to script path if undefined
if "%HOP_HOME%"=="" (
-set HOP_HOME=%INSTALL_DIR%
+set HOP_HOME=%SCRIPT_DIR%
)
cd %HOP_HOME%
@@ -76,4 +76,4 @@ if not "%HOP_HOME%"=="" (
)
:: Invoking the Java virtual machine
-%JAVACMD% %JAVA_OPTIONS% -jar "%INSTALL_DIR%\launcher.jar" %OPTIONS%
+%JAVACMD% %JAVA_OPTIONS% -jar launcher.jar %OPTIONS%

hop.sh
View file

@@ -1,7 +1,7 @@
#!/bin/sh
# Shell script for starting Helma with a JDK-like virtual machine.
# To add JAR files to the classpath, simply place them into the
# lib/ext directory.
# uncomment to set JAVA_HOME variable
@@ -32,11 +32,6 @@ else
JAVACMD=java
fi
-# Get the Helma installation directory
-INSTALL_DIR="${0%/*}"
-cd $INSTALL_DIR
-INSTALL_DIR=$PWD
# get HOP_HOME variable if it isn't set
if test -z "$HOP_HOME"; then
# try to get HOP_HOME from script file and pwd
@@ -69,5 +64,5 @@ if [ "$HOP_HOME" ]; then
SWITCHES="$SWITCHES -h $HOP_HOME"
fi
-# Invoke the Java VM
-$JAVACMD $JAVA_OPTIONS -jar "$INSTALL_DIR/launcher.jar" $SWITCHES
+# Invoking the Java VM
+$JAVACMD $JAVA_OPTIONS -jar launcher.jar $SWITCHES

Binary file not shown.

View file

@@ -168,13 +168,7 @@ public class ClassInfo {
if (indexedReadMethod != null && indexedReadMethod.getParameterTypes().length != 1) {
throw new ProgrammingError("Indexed getter of property ' " + propertyName + "' should have 1 parameter!");
}
-// Work around reflection bug, Hannes Wallnoefer 11/2002
-if (indexedReadMethod != null && Modifier.isPublic (indexedReadMethod.getModifiers ()))
-indexedReadMethod.setAccessible (true);
if (indexedWriteMethod != null) {
-// Work around reflection bug, Hannes Wallnoefer 11/2002
-if (Modifier.isPublic (indexedWriteMethod.getModifiers ()))
-indexedWriteMethod.setAccessible (true);
Class [] paramCls = indexedWriteMethod.getParameterTypes();
if (paramCls == null || paramCls.length != 2) {
throw new ProgrammingError("Indexed setter of property ' " + propertyName + "' should have 2 parameter!");
@@ -194,13 +188,7 @@ public class ClassInfo {
if (readMethod != null && readMethod.getParameterTypes().length != 0) {
throw new ProgrammingError("Non indexed getter of indxed property ' " + propertyName + "' is not supposed to have a parameter!");
}
-// Work around reflection bug, Hannes Wallnoefer 11/2002
-if (readMethod != null && Modifier.isPublic (readMethod.getModifiers ()))
-readMethod.setAccessible (true);
if (writeMethod != null) {
-// Work around reflection bug, Hannes Wallnoefer 11/2002
-if (Modifier.isPublic (writeMethod.getModifiers ()))
-writeMethod.setAccessible (true);
Class [] paramCls = writeMethod.getParameterTypes();
if (paramCls == null || paramCls.length != 1) {
throw new ProgrammingError("Non indexed setter of indexed property ' " + propertyName + "' should have 1 parameter!");
@@ -220,13 +208,7 @@ public class ClassInfo {
if (readMethod != null && readMethod.getParameterTypes().length != 0) {
throw new ProgrammingError("Non indexed getter of property ' " + propertyName + "' is not supposed to have a parameter!");
}
-// Work around reflection bug, Hannes Wallnoefer 11/2002
-if (readMethod != null && Modifier.isPublic (readMethod.getModifiers ()))
-readMethod.setAccessible (true);
if (writeMethod != null) {
-// Work around reflection bug, Hannes Wallnoefer 11/2002
-if (Modifier.isPublic (writeMethod.getModifiers ()))
-writeMethod.setAccessible (true);
Class [] paramCls = writeMethod.getParameterTypes();
if (paramCls == null || paramCls.length != 1) {
throw new ProgrammingError("Non indexed setter of property ' " + propertyName + "' should have 1 parameter!");
@@ -236,8 +218,8 @@ public class ClassInfo {
}
}
}
// Add to cache
if (debug) System.out.println("** property '" + propertyName + "' + found, add to cache");
if (beanProperties==null) {
beanProperties = new Hashtable();
@@ -248,7 +230,7 @@ public class ClassInfo {
}
return descriptor;
}
/**
* Get the list of public method in this class or superclass, by name (the
@@ -417,7 +399,6 @@ public class ClassInfo {
}
} // if class not public
-// Work around reflection bug, Hannes Wallnoefer 11/2002
if (Modifier.isPublic (method.getModifiers ()))
method.setAccessible (true);
// save it
@@ -537,7 +518,6 @@ public class ClassInfo {
}
} // for
} // if class not public
-// Work around reflection bug, Hannes Wallnoefer 11/2002
if (Modifier.isPublic (method.getModifiers ()))
method.setAccessible (true);
// save it
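For readers unfamiliar with the workaround removed above: the deleted lines pre-mark public bean accessors with setAccessible(true) so that later reflective calls skip the JVM's per-invocation access check. A minimal stand-alone sketch of that pattern, using an arbitrary method purely for illustration (this is not code from the commit):

    import java.lang.reflect.Method;
    import java.lang.reflect.Modifier;

    public class AccessibleSketch {
        public static void main(String[] args) throws Exception {
            // Look up any public method reflectively.
            Method trim = String.class.getMethod("trim");
            // The removed workaround: only public methods are marked accessible.
            if (Modifier.isPublic(trim.getModifiers()))
                trim.setAccessible(true);
            // Subsequent invocations no longer pay the access check.
            System.out.println(trim.invoke("  hello  "));
        }
    }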

View file

@@ -25,7 +25,7 @@ import org.apache.xmlrpc.*;
public class Server implements IPathElement, Runnable {
-public static final String version = "1.2 RC2 2002/12/05";
+public static final String version = "1.2 RC1 2002/12/03";
public final long starttime;
// if true we only accept RMI and XML-RPC connections from

View file

@@ -1,46 +0,0 @@
-// DbColumn.java
-// Copyright 2002 Hannes Wallnoefer, Helma.org
-
-package helma.objectmodel.db;
-
-/**
- * A class that encapsulates the Column name and data type of a
- * column in a relational table.
- */
-public final class DbColumn {
-
-    private final String name;
-    private final int type;
-    private final Relation relation;
-
-    /**
-     * Constructor
-     */
-    public DbColumn (String name, int type, Relation rel) {
-        this.name = name;
-        this.type = type;
-        this.relation = rel;
-    }
-
-    /**
-     * Get the column name.
-     */
-    public String getName() {
-        return name;
-    }
-
-    /**
-     * Get this columns SQL data type.
-     */
-    public int getType() {
-        return type;
-    }
-
-    /**
-     * Return the relation associated with this column. May be null.
-     */
-    public Relation getRelation() {
-        return relation;
-    }
-}
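For orientation only, a tiny sketch of how the DbColumn class listed above would be exercised if it were still on the classpath; the column name and type are made-up examples, and null stands in for the optional Relation:

    import helma.objectmodel.db.DbColumn;
    import java.sql.Types;

    public class DbColumnSketch {
        public static void main(String[] args) {
            // Hypothetical column: name, java.sql.Types constant, no Relation mapped.
            DbColumn col = new DbColumn("AUTHOR_NAME", Types.VARCHAR, null);
            System.out.println(col.getName());      // AUTHOR_NAME
            System.out.println(col.getType());      // 12, i.e. Types.VARCHAR
            System.out.println(col.getRelation());  // null is explicitly allowed here
        }
    }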

View file

@@ -13,8 +13,8 @@ import java.util.StringTokenizer;
import java.sql.*;
import com.workingdogs.village.*;
/**
* A DbMapping describes how a certain type of Nodes is to mapped to a
* relational database table. Basically it consists of a set of JavaScript property-to-
* Database row bindings which are represented by instances of the Relation class.
*/
@@ -43,21 +43,15 @@ public final class DbMapping implements Updatable {
Relation subRelation;
Relation propRelation;
-// if this defines a subnode mapping with groupby layer,
-// we need a DbMapping for those groupby nodes
+// if this defines a subnode mapping with groupby layer, we need a DbMapping for those groupby nodes
DbMapping groupbyMapping;
// Map of property names to Relations objects
HashMap prop2db;
-// Map of db columns to Relations objects.
-// Case insensitive, keys are stored in upper case so
-// lookups must do a toUpperCase().
+// Map of db columns to Relations objects
HashMap db2prop;
// list of columns to fetch from db
-DbColumn[] columns = null;
-// Map of db columns by name
-HashMap columnMap;
+String[] columns = null;
// pre-rendered select statement
String select = null;
@@ -121,8 +115,6 @@ public final class DbMapping implements Updatable {
prop2db = new HashMap ();
db2prop = new HashMap ();
-columnMap = new HashMap ();
parent = null;
@@ -193,7 +185,6 @@ public final class DbMapping implements Updatable {
keydef = null;
// same with columns and select string
columns = null;
-columnMap.clear();
select = null;
@@ -220,8 +211,8 @@ public final class DbMapping implements Updatable {
rel.update (dbField, props);
p2d.put (propName, rel);
if (rel.columnName != null &&
(rel.reftype == Relation.PRIMITIVE ||
rel.reftype == Relation.REFERENCE))
d2p.put (rel.columnName.toUpperCase (), rel);
// app.logEvent ("Mapping "+propName+" -> "+dbField);
}
@@ -240,7 +231,7 @@ public final class DbMapping implements Updatable {
if (subRelation == null)
subRelation = new Relation (subnodeMapping, "_children", this, props);
subRelation.update (subnodeMapping, props);
// if subnodes are accessed via access name or group name,
// the subnode relation is also the property relation.
if (subRelation.accessor != null || subRelation.groupby != null)
propRelation = subRelation;
@@ -259,7 +250,7 @@ public final class DbMapping implements Updatable {
}
/**
* Method in interface Updatable.
*/
public void remove () {
@@ -488,7 +479,7 @@ public final class DbMapping implements Updatable {
groupbyMapping.typename = subRelation.groupbyPrototype;
}
public void setPropertyRelation (Relation rel) {
propRelation = rel;
}
@@ -576,11 +567,11 @@ public final class DbMapping implements Updatable {
/**
-* Return an array of DbColumns for the relational table mapped by this DbMapping.
+* Return a Village Schema object for this DbMapping.
*/
-public synchronized DbColumn[] getColumns() throws ClassNotFoundException, SQLException {
+public synchronized String[] getColumns() throws ClassNotFoundException, SQLException {
if (!isRelational ())
-throw new SQLException ("Can't get columns for non-relational data mapping "+this);
+throw new SQLException ("Can't get Schema for non-relational data mapping");
if (source == null && parentMapping != null)
return parentMapping.getColumns ();
// Use local variable cols to avoid synchronization (schema may be nulled elsewhere)
@@ -589,46 +580,25 @@ public final class DbMapping implements Updatable {
// and build a string of column names.
Connection con = getConnection ();
Statement stmt = con.createStatement ();
-String t = getTableName();
-if (t == null)
-throw new SQLException ("Table name is null in getColumns() for "+this);
-ResultSet rs = stmt.executeQuery (
-new StringBuffer("SELECT * FROM ")
-.append(t).append(" WHERE 1 = 0").toString());
+ResultSet rs = stmt.executeQuery ("select * from "+getTableName()+" where 1 = 0");
if (rs == null)
-throw new SQLException ("Error retrieving columns for "+this);
+throw new SQLException ("Error retrieving DB scheme for "+this);
ResultSetMetaData meta = rs.getMetaData ();
// ok, we have the meta data, now loop through mapping...
int ncols = meta.getColumnCount ();
-columns = new DbColumn[ncols];
+columns = new String[ncols];
for (int i=0; i<ncols; i++) {
-String colName = meta.getColumnName (i+1);
-Relation rel = columnNameToRelation (colName);
-columns[i] = new DbColumn (colName, meta.getColumnType (i+1), rel);
+columns[i] = meta.getColumnName (i+1);
+Relation rel = columnNameToRelation (columns[i]);
+if (rel == null || (rel.reftype != Relation.PRIMITIVE &&
+rel.reftype != Relation.REFERENCE))
+continue;
+rel.setColumnType (meta.getColumnType (i+1));
}
}
return columns;
}
-public DbColumn getColumn (String columnName) throws ClassNotFoundException, SQLException {
-DbColumn col = (DbColumn) columnMap.get(columnName);
-if (col == null) {
-DbColumn[] cols = columns;
-if (cols == null)
-cols = getColumns();
-for (int i=0; i<cols.length; i++) {
-if (columnName.equalsIgnoreCase (cols[i].getName())) {
-col = cols[i];
-break;
-}
-}
-if (col == null)
-throw new SQLException ("Column "+columnName+" not found in "+this);
-columnMap.put (columnName, col);
-}
-return col;
-}
public StringBuffer getSelect () throws SQLException, ClassNotFoundException {
String sel = select;
if (sel != null)
@@ -649,8 +619,13 @@ public final class DbMapping implements Updatable {
if (table == null && parentMapping != null)
return parentMapping.needsQuotes (columnName);
try {
-DbColumn col = getColumn (columnName);
-switch (col.getType()) {
+Relation rel = (Relation) db2prop.get (columnName.toUpperCase());
+if (rel == null)
+throw new SQLException ("Error retrieving relational schema for "+this);
+// make sure columns are initialized and up to date
+if (columns == null)
+getColumns();
+switch (rel.getColumnType()) {
case Types.CHAR:
case Types.VARCHAR:
case Types.LONGVARCHAR:
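Both the removed and the restored getColumns() above rely on the same JDBC trick: select an empty result set and read the column layout from its ResultSetMetaData. A self-contained sketch of just that technique, assuming an in-memory H2 database is on the classpath (the table and JDBC URL are placeholders, not part of Helma):

    import java.sql.*;

    public class ColumnProbe {
        public static void main(String[] args) throws Exception {
            Connection con = DriverManager.getConnection("jdbc:h2:mem:probe");
            Statement stmt = con.createStatement();
            stmt.executeUpdate("CREATE TABLE AUTHOR (ID INT, NAME VARCHAR(80))");
            // "WHERE 1 = 0" returns no rows but still carries the table's metadata.
            ResultSet rs = stmt.executeQuery("SELECT * FROM AUTHOR WHERE 1 = 0");
            ResultSetMetaData meta = rs.getMetaData();
            for (int i = 1; i <= meta.getColumnCount(); i++)
                System.out.println(meta.getColumnName(i) + " -> " + meta.getColumnType(i));
            con.close();
        }
    }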

View file

@@ -234,7 +234,7 @@ public final class Node implements INode, Serializable {
/**
* Constructor used for nodes being stored in a relational database table.
*/
-public Node (DbMapping dbm, ResultSet rs, DbColumn[] columns, WrappedNodeManager nmgr)
+public Node (DbMapping dbm, ResultSet rs, String[] columns, WrappedNodeManager nmgr)
throws SQLException {
this.nmgr = nmgr;
@@ -265,35 +265,35 @@ public final class Node implements INode, Serializable {
for (int i=0; i<columns.length; i++) {
-Relation rel = columns[i].getRelation();
+Relation rel = dbm.columnNameToRelation (columns[i]);
if (rel == null || (rel.reftype != Relation.PRIMITIVE &&
rel.reftype != Relation.REFERENCE))
continue;
Property newprop = new Property (rel.propName, this);
-switch (columns[i].getType()) {
+switch (rel.getColumnType()) {
case Types.BIT:
-newprop.setBooleanValue (rs.getBoolean(columns[i].getName()));
+newprop.setBooleanValue (rs.getBoolean(columns[i]));
break;
case Types.TINYINT:
case Types.BIGINT:
case Types.SMALLINT:
case Types.INTEGER:
-newprop.setIntegerValue (rs.getLong(columns[i].getName()));
+newprop.setIntegerValue (rs.getLong(columns[i]));
break;
case Types.REAL:
case Types.FLOAT:
case Types.DOUBLE:
-newprop.setFloatValue (rs.getDouble(columns[i].getName()));
+newprop.setFloatValue (rs.getDouble(columns[i]));
break;
case Types.DECIMAL:
case Types.NUMERIC:
-BigDecimal num = rs.getBigDecimal (columns[i].getName());
+BigDecimal num = rs.getBigDecimal (columns[i]);
if (num == null)
break;
if (num.scale() > 0)
@@ -305,20 +305,20 @@ public final class Node implements INode, Serializable {
case Types.LONGVARBINARY:
case Types.VARBINARY:
case Types.BINARY:
-newprop.setStringValue (rs.getString(columns[i].getName()));
+newprop.setStringValue (rs.getString(columns[i]));
break;
case Types.LONGVARCHAR:
case Types.CHAR:
case Types.VARCHAR:
case Types.OTHER:
-newprop.setStringValue (rs.getString(columns[i].getName()));
+newprop.setStringValue (rs.getString(columns[i]));
break;
case Types.DATE:
case Types.TIME:
case Types.TIMESTAMP:
-newprop.setDateValue (rs.getTimestamp(columns[i].getName()));
+newprop.setDateValue (rs.getTimestamp(columns[i]));
break;
case Types.NULL:
@@ -327,7 +327,7 @@ public final class Node implements INode, Serializable {
// continue;
default:
-newprop.setStringValue (rs.getString(columns[i].getName()));
+newprop.setStringValue (rs.getString(columns[i]));
break;
}
@@ -1837,7 +1837,7 @@ public final class Node implements INode, Serializable {
*/
public INode getNonVirtualParent () {
INode node = this;
-for (int i=0; i<5; i++) {
+for (int i=0; i<3; i++) {
if (node == null) break;
if (node.getState() != Node.VIRTUAL)
return node;
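The Node constructor above dispatches on java.sql.Types to pick the right ResultSet getter for each column. As a rough, stand-alone illustration of that dispatch (the helper name and the reduced set of cases are assumptions of this sketch, not Helma code):

    import java.sql.ResultSet;
    import java.sql.SQLException;
    import java.sql.Types;

    public class TypeDispatchSketch {
        // Reads one column with a getter chosen by its SQL type, much like the
        // constructor above does for each mapped property.
        static Object readColumn(ResultSet rs, String column, int sqlType) throws SQLException {
            switch (sqlType) {
                case Types.BIT:       return Boolean.valueOf(rs.getBoolean(column));
                case Types.INTEGER:
                case Types.BIGINT:    return Long.valueOf(rs.getLong(column));
                case Types.FLOAT:
                case Types.DOUBLE:    return Double.valueOf(rs.getDouble(column));
                case Types.TIMESTAMP: return rs.getTimestamp(column);
                default:              return rs.getString(column);
            }
        }
    }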

View file

@@ -580,13 +580,7 @@ public final class NodeManager {
try {
Connection con = dbm.getConnection ();
st = con.createStatement ();
-st.executeUpdate (new StringBuffer ("DELETE FROM ")
-.append(dbm.getTableName ())
-.append(" WHERE ")
-.append(dbm.getIDField())
-.append(" = ")
-.append(node.getID())
-.toString());
+st.executeUpdate ("DELETE FROM "+dbm.getTableName ()+" WHERE "+dbm.getIDField ()+" = "+node.getID ());
} finally {
if (st != null) try {
st.close ();
@@ -611,11 +605,7 @@ public final class NodeManager {
Statement stmt = null;
try {
Connection con = map.getConnection ();
-String q = new StringBuffer("SELECT MAX(")
-.append(map.getIDField())
-.append(") FROM ")
-.append(map.getTableName())
-.toString();
+String q = "SELECT MAX("+map.getIDField()+") FROM "+map.getTableName();
stmt = con.createStatement ();
ResultSet rs = stmt.executeQuery (q);
// check for empty table
@@ -649,10 +639,7 @@ public final class NodeManager {
String retval = null;
try {
Connection con = map.getConnection ();
-String q = new StringBuffer("SELECT ")
-.append(map.getIDgen())
-.append(".nextval FROM dual")
-.toString();
+String q = "SELECT "+map.getIDgen()+".nextval FROM dual";
stmt = con.createStatement();
ResultSet rs = stmt.executeQuery (q);
if (!rs.next ())
@@ -690,28 +677,15 @@ public final class NodeManager {
Statement stmt = null;
try {
String q = null;
if (home.getSubnodeRelation() != null) {
// subnode relation was explicitly set
-q = new StringBuffer("SELECT ")
-.append(idfield)
-.append(" FROM ")
-.append(table)
-.append(" ")
-.append(home.getSubnodeRelation())
-.toString();
+q = "SELECT "+idfield+" FROM "+table+" "+home.getSubnodeRelation();
} else {
// let relation object build the query
-q = new StringBuffer("SELECT ")
-.append(idfield)
-.append(" FROM ")
-.append(table)
-.append(rel.buildQuery (home,
-home.getNonVirtualParent (), null,
-" WHERE ", true))
-.toString();
+q = "SELECT "+idfield+" FROM "+table + rel.buildQuery (home, home.getNonVirtualParent (), null, " WHERE ", true);
}
if (logSql)
@@ -721,7 +695,7 @@ public final class NodeManager {
if (rel.maxSize > 0)
stmt.setMaxRows (rel.maxSize);
ResultSet result = stmt.executeQuery (q);
// problem: how do we derive a SyntheticKey from a not-yet-persistent Node?
Key k = rel.groupby != null ? home.getKey (): null;
while (result.next ()) {
@@ -732,8 +706,8 @@ public final class NodeManager {
continue;
// make the proper key for the object, either a generic DB key or a groupby key
Key key = rel.groupby == null ?
(Key) new DbKey (rel.otherType, kstr) :
(Key) new SyntheticKey (k, kstr);
retval.add (new NodeHandle (key));
// if these are groupby nodes, evict nullNode keys
if (rel.groupby != null) {
@@ -776,7 +750,7 @@ public final class NodeManager {
Connection con = dbm.getConnection ();
Statement stmt = con.createStatement ();
-DbColumn[] columns = dbm.getColumns ();
+String[] columns = dbm.getColumns ();
StringBuffer q = dbm.getSelect ();
try {
if (home.getSubnodeRelation() != null) {
@@ -834,7 +808,7 @@ public final class NodeManager {
if (missing > 0) {
Connection con = dbm.getConnection ();
Statement stmt = con.createStatement ();
-DbColumn[] columns = dbm.getColumns ();
+String[] columns = dbm.getColumns ();
StringBuffer q = dbm.getSelect ();
try {
String idfield = rel.groupby != null ? rel.groupby : dbm.getIDField ();
@@ -861,10 +835,8 @@ public final class NodeManager {
q.append (") ");
if (rel.groupby != null) {
q.append (rel.renderConstraints (home, home.getNonVirtualParent ()));
-if (rel.order != null) {
-q.append (" ORDER BY ");
-q.append (rel.order);
-}
+if (rel.order != null)
+q.append (" ORDER BY "+rel.order);
}
if (logSql)
@@ -965,27 +937,19 @@ public final class NodeManager {
Statement stmt = null;
try {
String q = null;
if (home.getSubnodeRelation() != null) {
// use the manually set subnoderelation of the home node
-q = new StringBuffer("SELECT count(*) FROM ")
-.append(table)
-.append(" ")
-.append(home.getSubnodeRelation())
-.toString();
+q = "SELECT count(*) FROM "+table+" "+home.getSubnodeRelation();
} else {
// let relation object build the query
-q = new StringBuffer("SELECT count(*) FROM ")
-.append(table)
-.append(rel.buildQuery (home, home.getNonVirtualParent(),
-null, " WHERE ", false))
-.toString();
+q = "SELECT count(*) FROM "+table + rel.buildQuery (home, home.getNonVirtualParent (), null, " WHERE ", false);
}
if (logSql)
app.logEvent ("### countNodes: "+q);
stmt = con.createStatement();
ResultSet rs = stmt.executeQuery (q);
@@ -1026,13 +990,7 @@ public final class NodeManager {
Statement stmt = null;
try {
-String q = new StringBuffer("SELECT ")
-.append(namefield)
-.append(" FROM ")
-.append(table)
-.append(" ORDER BY ")
-.append(namefield)
-.toString();
+String q = "SELECT "+namefield+" FROM "+table+" ORDER BY "+namefield;
stmt = con.createStatement ();
if (logSql)
@@ -1079,7 +1037,7 @@ public final class NodeManager {
Connection con = dbm.getConnection ();
stmt = con.createStatement ();
-DbColumn[] columns = dbm.getColumns ();
+String[] columns = dbm.getColumns ();
StringBuffer q = dbm.getSelect ();
q.append ("WHERE ");
q.append (idfield);
@@ -1141,7 +1099,7 @@ public final class NodeManager {
DbMapping dbm = rel.otherType;
Connection con = dbm.getConnection ();
-DbColumn[] columns = dbm.getColumns ();
+String[] columns = dbm.getColumns ();
StringBuffer q = dbm.getSelect ();
if (home.getSubnodeRelation () != null) {
// combine our key with the constraints in the manually set subnode relation
@@ -1186,7 +1144,7 @@ public final class NodeManager {
}
/**
* Get a DbMapping for a given prototype name. This is just a proxy
* method to the app's getDbMapping() method.
*/
public DbMapping getDbMapping (String protoname) {
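The NodeManager hunks above trade StringBuffer-built SQL strings for plain concatenation (the minus side uses StringBuffer, the plus side the older one-line form restored by this tag). Both spellings produce identical statement text; a small sketch with placeholder identifiers, not taken from Helma:

    public class QueryBuildSketch {
        public static void main(String[] args) {
            String table = "AUTHOR", idField = "ID", id = "42";

            // Concatenation style, as on the plus side of the hunks above.
            String concatenated = "DELETE FROM " + table + " WHERE " + idField + " = " + id;

            // StringBuffer style, as on the minus side.
            String buffered = new StringBuffer("DELETE FROM ")
                    .append(table)
                    .append(" WHERE ")
                    .append(idField)
                    .append(" = ")
                    .append(id)
                    .toString();

            System.out.println(concatenated.equals(buffered));  // prints: true
        }
    }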