Changes between Version 8 and Version 9 of DynamicDatasources


Timestamp: 04/12/06 16:22:34 (19 years ago)
Author: grimnes
  • DynamicDatasources

 * Create a new class that extends CrawlerBase, and add a constructor that takes a DataSource, calls super(), and then calls setDataSource with the argument. Use this constructor to return a new crawler instance from your CrawlerFactory class.
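 Something like this, using the RSSCrawler I build below (the import paths and the exact CrawlerFactory signature may differ in your Aperture version):
{{{
import org.semanticdesktop.aperture.crawler.base.CrawlerBase;
import org.semanticdesktop.aperture.datasource.DataSource;

public class RSSCrawler extends CrawlerBase {
    public RSSCrawler(DataSource source) {
        super();
        setDataSource(source);
    }
    // crawlObjects() is implemented below
}
}}}
 and in the crawler factory:
{{{
public Crawler getCrawler(DataSource source) {
    return new RSSCrawler(source);
}
}}}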
 * Now the CrawlerFactory, DataSource and DataSourceFactory classes are all finished - only the meat remains: implementing the crawlObjects method.
 * First you have to decide what configuration options your datasource will take. Have a look at [http://aperture.sourceforge.net/ontology/source.rdfs the aperture datasource schema] for a selection. In my case I will use source:rootURI to specify which RSS feed to crawl. Use the aperture utility method to get this:
{{{
// read the configured root URI from the datasource configuration
RDFContainer config = source.getConfiguration();
String root = ConfigurationUtil.getRootUrl(config);
}}}
 You are of course free to make up any config properties you want, but then the ConfigurationUtil class might not help you.
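 If you do invent your own property, you can read it straight from the RDFContainer; roughly like this (the property URI is made up for illustration, and I assume the RDF2Go URIImpl class and the RDFContainer getString accessor - check your Aperture version):
{{{
import org.ontoware.rdf2go.model.node.URI;
import org.ontoware.rdf2go.model.node.impl.URIImpl;

// hypothetical property of your own datasource
URI updateInterval = new URIImpl("http://example.org/myDatasource#updateInterval");
String interval = config.getString(updateInterval);
}}}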
 * Implementing the crawlObjects method is clearly quite datasource dependent; in my case I add the rome and jdom jars, and copy an example of how to read a feed (see the sketch after these hints). Some other hints:
 ** Gnowsis uses java.util.logging for logging, so to get useful debugging messages add this at the top of your class:
{{{
import java.util.logging.Logger;

Logger log = Logger.getLogger(RSSCrawler.class.getName());
}}}
 ** The return value of crawlObjects is taken from ExitCode, which has predefined values for you.
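 For reference, this is roughly what my crawlObjects ends up looking like. It is only a sketch: the ROME calls follow the standard feed-reading example, error handling is simplified, and the part that turns feed entries into DataObjects and reports them to the crawler handler is left out:
{{{
import java.net.URL;
import java.util.Iterator;
import java.util.logging.Level;

import org.semanticdesktop.aperture.crawler.ExitCode;
import org.semanticdesktop.aperture.rdf.RDFContainer;
import org.semanticdesktop.aperture.util.ConfigurationUtil;

import com.sun.syndication.feed.synd.SyndEntry;
import com.sun.syndication.feed.synd.SyndFeed;
import com.sun.syndication.io.SyndFeedInput;
import com.sun.syndication.io.XmlReader;

// inside RSSCrawler
protected ExitCode crawlObjects() {
    try {
        // read the feed URL from the datasource configuration
        RDFContainer config = getDataSource().getConfiguration();
        String root = ConfigurationUtil.getRootUrl(config);

        // let ROME fetch and parse the feed
        SyndFeed feed = new SyndFeedInput().build(new XmlReader(new URL(root)));

        for (Iterator i = feed.getEntries().iterator(); i.hasNext();) {
            SyndEntry entry = (SyndEntry) i.next();
            log.fine("crawled feed entry: " + entry.getLink());
            // ... create a DataObject for the entry and report it to the handler ...
        }
        return ExitCode.COMPLETED;
    } catch (Exception e) {
        log.log(Level.WARNING, "could not crawl feed", e);
        return ExitCode.FATAL_ERROR;
    }
}
}}}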