Compare commits

109 Commits

Author SHA1 Message Date
jrandom
3e51584b3c 0.6.0.4 2005-09-01 20:27:35 +00:00
jrandom
4ff8a53084 2005-09-01 jrandom
* Don't send out a netDb store of a router if it is more than a few hours
      old, even if someone asked us for it.
2005-09-01 06:55:00 +00:00
jrandom
ccb73437c4 2005-08-31 jrandom
* Don't publish leaseSets to the netDb if they will never be looked for -
      namely, if they are for destinations that only establish outbound
      streams.  I2PTunnel's 'client' and 'httpclient' proxies have been
      modified to tell the router that it doesn't need to publish their
      leaseSet (by setting the I2CP config option 'i2cp.dontPublishLeaseSet'
      to 'true').
    * Don't publish the top 10 peer rankings of each router in the netdb, as
      it isn't being watched right now.
2005-09-01 00:26:20 +00:00
jrandom
b43114f61b 2005-08-31 jrandom
* Don't publish leaseSets to the netDb if they will never be looked for -
      namely, if they are for destinations that only establish outbound
      streams.  I2PTunnel's 'client' and 'httpclient' proxies have been
      modified to tell the router that it doesn't need to publish their
      leaseSet (by setting the I2CP config option 'i2cp.dontPublishLeaseSet'
      to 'true').
    * Don't publish the top 10 peer rankings of each router in the netdb, as
      it isn't being watched right now.
2005-09-01 00:20:16 +00:00
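
The i2cp.dontPublishLeaseSet option referenced in the two entries above shows up as a one-line change in the I2PTunnelClientBase diff further down. Below is a minimal sketch of how an outbound-only client entry point might set it; the class and field names here are illustrative assumptions, not the actual tunnel code.

    // Illustrative only: an outbound-only client marks its destination as
    // unpublishable before the I2CP session is created, so the router never
    // floods this leaseSet into the netDb.
    import java.util.Properties;

    class OutboundOnlyClientSketch {
        private final Properties clientOptions = new Properties();

        void configure() {
            // no inbound lookups will ever be made for this destination
            clientOptions.setProperty("i2cp.dontPublishLeaseSet", "true");
        }
    }
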
jrandom
9bd87ab511 make it work with any host charset or content charset 2005-08-31 09:50:23 +00:00
jrandom
b6ea55f7ef more error handling (thanks frosk) 2005-08-30 02:39:37 +00:00
jrandom
5f18cec97d 2005-08-29 jrandom
* Added the new test Floodfill netDb
2005-08-30 02:04:17 +00:00
jrandom
3ba921ec0e 2005-08-29 jrandom
* Added the new test Floodfill netDb
2005-08-30 01:59:11 +00:00
jrandom
e313da254c 2005-08-27 jrandom
* Minor logging and optimization tweaks in the router and SDK
    * Use ISO-8859-1 in the XML files (thanks redzara!)
    * The consolePassword config property can now be used to bypass the router
      console's nonce checking, allowing CLI restarts
2005-08-27 22:46:22 +00:00
jrandom
8660cf0d74 2005-08-27 jrandom
* Minor logging and optimization tweaks in the router and SDK
    * Use ISO-8859-1 in the XML files (thanks redzara!)
    * The consolePassword config property can now be used to bypass the router
      console's nonce checking, allowing CLI restarts
2005-08-27 22:15:35 +00:00
jrandom
e0bfdff152 TZ asap 2005-08-25 21:08:13 +00:00
jrandom
c27aed3603 fix up the entryId calc 2005-08-25 21:07:18 +00:00
jrandom
cdc6002f0e no message 2005-08-25 21:01:15 +00:00
jrandom
4cf3d9c1a2 HTTP file upload (rfc 1867) helper 2005-08-25 21:00:09 +00:00
jrandom
0473e08e21 remote w0rks 2005-08-25 20:59:46 +00:00
jrandom
346faa3de2 2005-08-24 jrandom
* Catch errors with corrupt tunnel messages more gracefully (no need to
      kill the thread and cause an OOM...)
    * Don't skip shitlisted peers for netDb store messages, as they aren't
      necessarily shitlisted by other people (though they probably are).
    * Adjust the netDb store per-peer timeout based on each particular peer's
      profile (timeout = 4x their average netDb store response time)
    * Don't republish leaseSets to *failed* peers - send them to peers who
      replied but just didn't know the value.
    * Set a 5 second timeout on the I2PTunnelHTTPServer reading the client's
      HTTP headers, rather than blocking indefinitely.  HTTP headers should be
      sent entirely within the first streaming packet anyway, so this won't be
      a problem.
    * Don't use the I2PTunnel*Server handler thread pool by default, as it may
      prevent any clients from accessing the server if the handlers get
      blocked by the streaming lib or other issues.
    * Don't overwrite a known status (OK/ERR-Reject/ERR-SymmetricNAT) with
      Unknown.
2005-08-24 22:55:25 +00:00
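
The per-peer store timeout rule in the entry above (4x the peer's average netDb store response time) can be sketched as below; the clamping bounds and names are assumptions added for illustration.

    // Sketch: derive a per-peer netDb store timeout from that peer's own
    // measured history rather than a global constant. Names are hypothetical.
    class StoreTimeoutSketch {
        private static final long MIN_TIMEOUT_MS = 2 * 1000;
        private static final long MAX_TIMEOUT_MS = 60 * 1000;

        long timeoutFor(double avgStoreResponseMs) {
            long timeout = (long) (4 * avgStoreResponseMs);
            // keep the result within sane bounds for peers with little or
            // wildly skewed history
            return Math.max(MIN_TIMEOUT_MS, Math.min(MAX_TIMEOUT_MS, timeout));
        }
    }
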
jrandom
5ec6dca64d 2005-08-23 jrandom
* Removed the concept of "no bandwidth limit" - if none is specified, it's
      16KBps in/out.
    * Include ack packets in the per-peer cwin throttle (they were part of the
      bandwidth limit though).
    * Tweak the SSU cwin operation to get more accurate estimates under
      congestion.
    * SSU improvements to resend more efficiently.
    * Added a basic scheduler to eepget to fetch multiple files sequentially.
2005-08-23 22:43:51 +00:00
jrandom
1a6b49cfb8 2005-08-23 jrandom
* Removed the concept of "no bandwidth limit" - if none is specified, it's
      16KBps in/out.
    * Include ack packets in the per-peer cwin throttle (they were part of the
      bandwidth limit though).
    * Tweak the SSU cwin operation to get more accurate estimates under
      congestion.
    * SSU improvements to resend more efficiently.
    * Added a basic scheduler to eepget to fetch multiple files sequentially.
2005-08-23 21:25:49 +00:00
cervantes
c7b75df390 Added announcement about the new Irc2P server at irc.freshcoffee.i2p 2005-08-22 13:03:11 +00:00
jrandom
f97c09291b 0.6.0.3 2005-08-21 19:21:50 +00:00
jrandom
8f2a5b403c * 2005-08-21 0.6.0.3 released
2005-08-21  jrandom
    * If we already have an established SSU session with the Charlie helping
      test us, cancel the test with the status of "unknown".
2005-08-21 18:39:05 +00:00
jrandom
ea41a90eae sanity checking 2005-08-21 18:37:57 +00:00
jrandom
b1dd29e64d added syndie.i2p and syndiemedia.i2p 2005-08-21 18:33:58 +00:00
jrandom
46e47c47ac ewps 2005-08-21 18:19:22 +00:00
jrandom
b7bf431f0d [these are not the droids you are looking for] 2005-08-21 18:08:05 +00:00
cervantes
7f432122d9 added irc.freshcoffee.i2p (new IRC server on the irc2p network) 2005-08-20 01:19:51 +00:00
cervantes
e7be8c6097 Added references to the new irc2p server: irc.freshcoffee.i2p 2005-08-20 01:18:38 +00:00
jrandom
adf56a16e1 2005-08-17 jrandom
* Revise the SSU peer testing protocol so that Bob verifies Charlie's
      viability before agreeing to Alice's request.  This doesn't work with
      older SSU peer test builds, but is backwards compatible (older nodes
      won't ask newer nodes to participate in tests, and newer nodes won't
      ask older nodes to either).
2005-08-17 20:16:27 +00:00
jrandom
11204b8a2b 2005-08-17 jrandom
* Revise the SSU peer testing protocol so that Bob verifies Charlie's
      viability before agreeing to Alice's request.  This doesn't work with
      older SSU peer test builds, but is backwards compatible (older nodes
      won't ask newer nodes to participate in tests, and newer nodes won't
      ask older nodes to either).
2005-08-17 20:05:01 +00:00
jrandom
cade27dceb added surrender.adab.i2p 2005-08-17 00:42:15 +00:00
smeghead
5597d28e59 Removing references to irc.duck.i2p, adding references to irc.arcturus.i2p, and replacing current ircProxy default destination string with "irc.postman.i2p,irc.arcturus.i2p" 2005-08-16 09:35:58 +00:00
jrandom
0502fec432 added terror.i2p 2005-08-15 18:44:04 +00:00
smeghead
a6714fc2de Adding irc.arcturus.i2p, a new server for the soon-to-be Irc2P network 2005-08-14 15:52:12 +00:00
jrandom
1219dadbd5 2005-08-12 jrandom
* Keep detailed stats on the peer testing, publishing the results in the
      netDb.
    * Don't overwrite the status with 'unknown' unless we haven't had a valid
      status in a while.
    * Make sure to avoid shitlisted peers for peer testing.
    * When we get an unknown result to a peer test, try again soon afterwards.
    * When a peer tells us that our address is different from what we expect,
      if we've done a recent peer test with a result of OK, fire off a peer
      test to make sure our IP/port is still valid.  If our test is old or the
      result was not OK, accept their suggestion, but queue up a peer test for
      later.
    * Don't try to do a netDb store to a shitlisted peer, and adjust the way
      we monitor netDb store progress (to clear up the high netDb.storePeers
      stat)
2005-08-12 23:54:46 +00:00
jrandom
77b995f5ed 2005-08-10 jrandom
* Deployed the peer testing implementation to be run every few minutes on
      each router, as well as any time the user requests a test manually.  The
      tests do not reconfigure the ports at the moment, merely determine under
      what conditions the local router is reachable.  The status shown in the
      top left will be "ERR-SymmetricNAT" if the user's IP and port show up
      differently for different peers, "ERR-Reject" if the router cannot
      receive unsolicited packets or the peer helping test could not find a
      collaborator, "Unknown" if the test has not been run or the test
      participants were unreachable, or "OK" if the router can receive
      unsolicited connections and those connections use the same IP and port.
2005-08-10 23:55:40 +00:00
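
The status values listed in the entry above can be read as the outcome of a small decision procedure; the enum and method below are a hedged sketch of that logic, not the router's actual reachability code.

    // Sketch of mapping peer-test observations to the console status.
    // All names and the exact decision order are assumptions.
    enum ReachabilitySketch { OK, ERR_REJECT, ERR_SYMMETRIC_NAT, UNKNOWN }

    final class PeerTestResultSketch {
        boolean testRan;              // did we reach any test participants?
        boolean receivedUnsolicited;  // could the helper reach us directly?
        boolean addressesConsistent;  // same IP:port seen by different peers?

        ReachabilitySketch status() {
            if (!testRan)
                return ReachabilitySketch.UNKNOWN;
            if (!addressesConsistent)
                return ReachabilitySketch.ERR_SYMMETRIC_NAT;
            if (!receivedUnsolicited)
                return ReachabilitySketch.ERR_REJECT;
            return ReachabilitySketch.OK;
        }
    }
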
jrandom
2f53b9ff68 0.6.0.2 2005-08-09 18:55:31 +00:00
jrandom
d84d045849 deal with full windows without *cough* NPEs
(how many times can I cvs rtag -F before going crazy?)
2005-08-08 21:20:08 +00:00
jrandom
d8e72dfe48 foo 2005-08-08 20:49:17 +00:00
jrandom
88b9f7a74c "ERROR [eive on 8887] uter.transport.udp.UDPReceiver: Dropping inbound packet with 1 queued for 1912 packet handlers: Handlers: 3 handler 0 state: 2 handler 1 state: 2 handler 2 state: 2"
state = 2 means all three handlers are blocking on udpReceiver.receive())
this can legitimately happen if the bandwidth limiter or router throttle chokes the receive for >= 1s.
2005-08-08 20:42:13 +00:00
jrandom
6a19501214 2005-08-08 jrandom
* Add a configurable throttle to the number of concurrent outbound SSU
      connection negotiations (via i2np.udp.maxConcurrentEstablish=4).  This
      may help those with slow connections to get integrated at the start.
    * Further fixlets to the streaming lib
2005-08-08 20:35:50 +00:00
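
A minimal sketch of the kind of throttle the i2np.udp.maxConcurrentEstablish option implies: a counting semaphore bounding concurrent outbound SSU establishments. The property name and default of 4 follow the entry above; everything else is an assumption.

    import java.util.concurrent.Semaphore;

    // Sketch only: limit how many outbound SSU establishments run at once.
    class EstablishThrottleSketch {
        private final Semaphore slots;

        EstablishThrottleSketch(java.util.Properties config) {
            int max = Integer.parseInt(
                config.getProperty("i2np.udp.maxConcurrentEstablish", "4"));
            slots = new Semaphore(max);
        }

        void establish(Runnable negotiation) throws InterruptedException {
            slots.acquire();          // wait for a free negotiation slot
            try {
                negotiation.run();    // the actual handshake would go here
            } finally {
                slots.release();
            }
        }
    }
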
jrandom
ba30b56c5f 2005-08-07 Complication
* Display the average clock skew for both SSU and TCP connections
2005-08-07  jrandom
    * Fixed the long standing streaming lib bug where we could lose the first
      packet on retransmission.
    * Avoid an NPE when a message expires on the SSU queue.
    * Adjust the streaming lib's window growth factor with an additional
      Vegas-esque congestion detection algorithm.
    * Removed an unnecessary SSU session drop
    * Reduced the MTU (until we get a working PMTU lib)
    * Defer tunnel acceptance until we know how to reach the next hop,
      rejecting it if we can't find them in time.
    * If our netDb store of our leaseSet fails, give it a few seconds before
      republishing.
2005-08-07 19:31:58 +00:00
jrandom
a375e4b2ce added more postman services (w3wt) 2005-08-07 19:27:22 +00:00
duck
44fd71e17f added i2p-bt.postman.i2p 2005-08-05 21:20:30 +00:00
smeghead
b41c378de9 Removed reference and link to Invisiblechat/IIP from the router console greeting page (because IIP's dead, Jim... how many times does it need to be said?) and added irc.postman.i2p. 2005-08-05 19:20:52 +00:00
jrandom
4ce6b308b3 * 2005-08-03 0.6.0.1 released
2005-08-03  jrandom
    * Backed out an inadvertent change to the netDb store redundancy factor.
    * Verify tunnel participant caching.
    * Logging cleanup
2005-08-03 18:58:12 +00:00
duck
72c6e7d1c5 2005-08-01 duck
* Update IzPack to 3.7.2 (build 2005.04.22). This fixes bug #82.
2005-08-02 03:26:51 +00:00
duck
7ca3f22e77 2005-08-01 duck
* Update IzPack to 3.7.2 (build 2005.04.22)
      This fixes bug #82
2005-08-02 03:25:51 +00:00
duck
59790dafef 2005-08-01 duck
* Fix an addressbook NPE when a new hostname from the master addressbook
      didn't exist in the router addressbook.
    * Fix an addressbook bug which caused subscriptions not to be parsed at
      all. (Oops!)
2005-08-01 13:35:11 +00:00
duck
7227cae6ef 2005-08-01 duck
* Fix an addressbook NPE when a new hostname from the master addressbook
      didn't exist in the router addressbook.
    * Fix an addressbook bug which caused subscriptions not to be parsed at
      all. (Oops!)
2005-08-01 13:34:10 +00:00
ragnarok
03bba51c1e * Fixed some issues with the merge logic that caused addressbooks to be written to disk even when unmodified.
* Fixed a bug that could result in a downloaded remote addressbook not being deleted, halting the update process.
2005-08-01 03:32:37 +00:00
ragnarok
0637050cbc No real reason for eepget to retry, addressbook will try again in an hour, and it makes updates take an absurdly long time. 2005-08-01 00:10:54 +00:00
ragnarok
7f58a68c5a Whoops! Forgot the new build file. I broke cvs! 2005-07-31 22:49:25 +00:00
ragnarok
8120b0397c Move addressbook off URL and on to EepGet. Should no longer leak dns lookups, but now only supports conditional GET with HTTP 1.1. If that's a big problem, it can be fixed in future. 2005-07-31 22:19:10 +00:00
ragnarok
fbe42b7dce Added HTTP 1.1 conditional GET support to EepGet. 2005-07-31 22:17:10 +00:00
jrandom
def24e34ad 2005-07-31 jrandom
* Adjust the netDb search and store per peer timeouts to match the average
      measured per peer success times, rather than huge fixed values.
    * Optimized and reverified the netDb peer selection / retrieval process
      within the kbuckets.
    * Drop TCP connections that don't have any useful activity in 10 minutes.
    * If i2np.udp.fixedPort=true, never change the externally published port,
      even if we are autodetecting the IP address.
(also includes most of the new peer/NAT testing, but that's not used atm)
2005-07-31 21:35:26 +00:00
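
The i2np.udp.fixedPort behaviour in the entry above amounts to a guard around the address-publication step; a hedged sketch, with config access and field names assumed:

    // Sketch: when i2np.udp.fixedPort=true, an autodetected IP may still be
    // published, but the externally advertised port never changes.
    class PublishedAddressSketch {
        private String publishedHost;
        private int publishedPort;

        void update(String detectedHost, int detectedPort, java.util.Properties config) {
            boolean fixedPort = Boolean.parseBoolean(
                config.getProperty("i2np.udp.fixedPort", "false"));
            publishedHost = detectedHost;
            if (!fixedPort)
                publishedPort = detectedPort;
        }
    }
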
duck
593253e6a3 update compilation target 2005-07-31 01:11:12 +00:00
smeghead
56dd4cb8b5 * added luckypunk.i2p to hosts.txt 2005-07-30 02:27:12 +00:00
jrandom
10c6f67500 oops 2005-07-28 20:33:27 +00:00
jrandom
5c1f968afa no message 2005-07-27 20:16:44 +00:00
jrandom
aaaf437d62 skip properly (DataHelper.read confusion) 2005-07-27 20:15:35 +00:00
jrandom
a8a866b5f6 * 2005-07-27 0.6 released
2005-07-27  jrandom
    * Enabled SSU as the default top priority transport, adjusting the
      config.jsp page accordingly.
    * Add verification fields to the SSU and TCP connection negotiation (not
      compatible with previous builds)
    * Enable the backwards incompatible tunnel crypto change as documented in
      tunnel-alt.html (have each hop encrypt the received IV before using it,
      then encrypt it again before sending it on)
    * Disable the I2CP encryption, leaving in place the end to end garlic
      encryption (another backwards incompatible change)
    * Adjust the protocol versions on the TCP and SSU transports so that they
      won't talk to older routers.
    * Fix up the config stats handling again
    * Fix a rare off-by-one in the SSU fragmentation
    * Reduce some unnecessary netDb resending by including the peers queried
      successfully in the store redundancy count.
2005-07-27 19:03:43 +00:00
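
The tunnel crypto change described above ("have each hop encrypt the received IV before using it, then encrypt it again before sending it on") can be illustrated with plain JCE AES-ECB on the 16-byte IV block. I2P's own AES implementation, key handling, and the surrounding tunnel processing are not shown; this is purely illustrative.

    import javax.crypto.Cipher;
    import javax.crypto.spec.SecretKeySpec;

    // Sketch of the per-hop IV handling only; not the router's actual code.
    class TunnelIvSketch {
        private final Cipher aes;

        TunnelIvSketch(byte[] ivKey16) throws Exception {
            aes = Cipher.getInstance("AES/ECB/NoPadding");
            aes.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(ivKey16, "AES"));
        }

        byte[][] process(byte[] receivedIv16) throws Exception {
            byte[] ivToUse = aes.doFinal(receivedIv16); // encrypt before use
            byte[] ivToSend = aes.doFinal(ivToUse);     // encrypt again before forwarding
            return new byte[][] { ivToUse, ivToSend };
        }
    }
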
jrandom
aeb8f02269 2005-07-22 jrandom
* Use the small thread pool for I2PTunnelHTTPServer (already used for
      I2PTunnelServer)
    * Minor memory churn reduction in I2CP
    * Small stats update
2005-07-23 00:15:56 +00:00
jrandom
45767360ab 2005-07-21 jrandom
* Fix in the SDK for a bug which would manifest itself as misrouted
      streaming packets when a destination has many concurrent streaming
      connections (thanks duck!)
    * No more "Graceful shutdown in -18140121441141s"
2005-07-21 22:37:14 +00:00
jrandom
3563aa2e4d 2005-07-20 jrandom
* Allow the user to specify an external port # for SSU even if the external
      host isn't specified (thanks duck!)
2005-07-20 19:24:47 +00:00
jrandom
843d5b625a 2005-07-19 jrandom
* Further preparation for removing I2CP crypto
    * Added some validation to the DH key agreement (thanks $anon)
    * Validate tunnel data message expirations (though not really a problem,
      since tunnels expire)
    * Minor PRNG threading cleanup
2005-07-19 21:00:25 +00:00
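
The DH validation mentioned above is, in spirit, a range check on the peer's public value; a hedged sketch using BigInteger (the checks actually added to the router may differ):

    import java.math.BigInteger;

    // Sketch: reject degenerate Diffie-Hellman public values (0, 1, p-1, or
    // out of range) before deriving a shared secret. Illustrative only.
    class DhValidationSketch {
        static boolean isValidPublicValue(BigInteger y, BigInteger p) {
            BigInteger two = BigInteger.valueOf(2);
            return y.compareTo(two) >= 0
                && y.compareTo(p.subtract(two)) <= 0;
        }
    }
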
cervantes
0f8ede85ca 2005-07-15 cervantes
* Added workaround for an odd win32 bug in the stats configuration
	  console page which meant only the first checkbox selection was saved.

2005-07-15  Romster
	* Added per group selection toggles in the stats configuration console
	  page.
2005-07-16 12:52:35 +00:00
jrandom
9267d7cae2 more n3ws 2005-07-13 21:59:01 +00:00
jrandom
dade5a981b 2005-07-13 jrandom
* Fixed a recently injected bug in the multitransport bidding which had
      allowed an essentially arbitrary choice of transports, rather than the
      properly ordered choice.
(getLatency() != getLatencyMs().  duh)
2005-07-13 20:07:31 +00:00
jrandom
f873cba27e 2005-07-13 jrandom
* Fixed a long standing bug where we weren't properly comparing session
      tags but instead largely depending upon comparing their hashCode,
      causing intermittent decryption errors.
2005-07-13 18:20:43 +00:00
jrandom
108dec53a5 * mixing a revert and some logging updates... (crosses fingers) 2005-07-12 22:30:13 +00:00
jrandom
e9592ed400 2005-07-12 jrandom
* Add some data duplication to avoid a recently injected concurrency problem
      in the session tag manager (thanks redzara and romster).
2005-07-12 21:26:07 +00:00
cervantes
4c230522a2 typo *ahem* 2005-07-12 03:56:42 +00:00
jrandom
16bd19c6dc added bash.i2p, stats.i2p 2005-07-11 23:23:22 +00:00
jrandom
b4b6d49d34 ssu testing 2005-07-11 23:16:41 +00:00
jrandom
9d5f16a889 2005-07-11 jrandom
* Reduced the growth factor on the slow start and congestion avoidance for
      the streaming lib.
    * Adjusted some of the I2PTunnelServer threading to use a small pool of
      handlers, rather than launching off new threads which then immediately
      launch off an I2PTunnelRunner instance (which launches 3 more threads..)
    * Don't persist session keys / session tags (not worth it, for now)
    * Added some detection and handling code for duplicate session tags being
      delivered (root cause still not addressed)
    * Make the PRNG's buffer size configurable (via the config property
      "i2p.prng.totalBufferSizeKB=4096")
    * Disable SSU flooding by default (duh)
    * Updates to the StreamSink apps for better throttling tests.
2005-07-11 23:06:23 +00:00
jrandom
51c492b842 no message 2005-07-09 23:02:19 +00:00
jrandom
d3380228ac * you mean 3f != 0x3f? [duh]
* minor cleanups
2005-07-09 22:58:22 +00:00
jrandom
ad47bf5da3 * moved the inbound partial messages to the PeerState itself, reducing lock contention in the InboundMessageFragments and transparently dropping failed messages when we drop old peer states 2005-07-07 22:27:44 +00:00
jrandom
76e8631e31 included IV tagging info 2005-07-07 21:16:57 +00:00
jrandom
f688b9112d 2005-07-05
* Use a buffered PRNG, pulling the PRNG data off a larger precalculated
      buffer, rather than the underlying PRNG's (likely small) one, which in
      turn reduces the frequency of recalcing.
    * More tuning to reduce temporary allocation churn
2005-07-05 22:08:56 +00:00
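
A minimal sketch of the buffered-PRNG idea in the entry above: fill a large buffer from the underlying PRNG in one call and hand out slices, so the expensive recalculation happens far less often. The buffer size corresponds to the i2p.prng.totalBufferSizeKB knob from the 2005-07-11 entry; the class itself is an assumption.

    import java.security.SecureRandom;

    // Sketch only: amortize PRNG cost by precomputing a large block of
    // random bytes and serving requests from it.
    class BufferedPrngSketch {
        private final SecureRandom prng = new SecureRandom();
        private final byte[] buffer;
        private int position;

        BufferedPrngSketch(int bufferSizeKB) {
            buffer = new byte[bufferSizeKB * 1024];
            position = buffer.length;            // force a fill on first use
        }

        synchronized void nextBytes(byte[] out) {
            for (int i = 0; i < out.length; i++) {
                if (position == buffer.length) {
                    prng.nextBytes(buffer);       // one big, infrequent refill
                    position = 0;
                }
                out[i] = buffer[position++];
            }
        }
    }
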
jrandom
18d3f5d25d 2005-07-04 jrandom
* Within the tunnel, use xor(IV, msg[0:16]) as the flag to detect dups,
      rather than the IV by itself, preventing an attack that would let
      colluding internal adversaries tag a message to determine that they are
      in the same tunnel.  Thanks dvorak for the catch!
    * Drop long inactive profiles on startup and shutdown
    * /configstats.jsp: web interface to pick what stats to log
    * Deliver more session tags to account for wider window sizes
    * Cache some intermediate values in our HMACSHA256 and BC's HMAC
    * Track the client send rate (stream.sendBps and client.sendBpsRaw)
    * UrlLauncher: adjust the browser selection order
    * I2PAppContext: hooks for dummy HMACSHA256 and a weak PRNG
    * StreamSinkClient: add support for sending an unlimited amount of data
    * Migrate the tests out of the default build jars

2005-06-22  Comwiz
    * Migrate the core tests to junit
2005-07-04 20:44:17 +00:00
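
The dup-detection change above replaces the bare IV with xor(IV, msg[0:16]) as the value fed to the duplicate filter, so two colluding hops can no longer recognize the same tunnel message by IV alone. A sketch of just that computation (the filter itself is omitted):

    // Sketch: compute the tunnel dup-check token from the IV and the first
    // 16 bytes of the message payload.
    class DupTokenSketch {
        static byte[] token(byte[] iv16, byte[] message) {
            byte[] token = new byte[16];
            for (int i = 0; i < 16; i++)
                token[i] = (byte) (iv16[i] ^ message[i]);
            return token;
        }
    }
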
comwiz
440cf2c983 2005-03-23 Comwiz
* Phase 1 of the unit test bounty completed. (The router build script was modified not to build the router
 tests because of a broken dependency on the core tests. This should be fixed in
 phase 3 of the unit test bounty.)
2005-06-23 02:11:04 +00:00
duck
adeb09576a util/PooledRandomSource.java 2005-06-03 20:23:32 +00:00
cervantes
fd52bcf8cd added archive.i2p, www.fr.i2p, romster.i2p, marshmallow.i2p, openforums.i2p 2005-05-26 04:51:24 +00:00
duck
c2696bba00 2005-05-25 duck
* Fixed PRNG bug (bugzilla #107)
2005-05-25 21:32:38 +00:00
connelly
fef9d57483 removed duplicate manveru.i2p 2005-05-10 05:53:18 +00:00
cervantes
c250692ef0 added bittorrent.i2p - new home for brittanytracker 2005-05-04 05:52:55 +00:00
jrandom
2a6024e196 end of first round of ssu testing 2005-05-03 00:53:53 +00:00
jrandom
835662b3c9 2005-05-01 jrandom
* Added a substantial optimization to the AES engine by caching the
      prepared session keys (duh).
2005-05-02 02:35:16 +00:00
jrandom
6b5b880ab6 * replaced explicit NACKs and numACKs with ACK bitfields for high congestion links
* increased the maximum number of fragments allowed in a message from 31 to 127,
  reducing the maximum fragment size to 8KB and moving around some bits in the fragment
  info.  This is not backwards compatible.
* removed the old (hokey) congestion control description, replacing it with the TCP-esque
  algorithm implemented
note: the code for the ACK bitfields and fragment info changes has not yet been
implemented, so the old version of this document describes what's going on in the live net.
the new bitfields / fragment info should be deployed in the next day or so (hopefully :)
2005-05-01 20:08:08 +00:00
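
One common way to realize the ACK bitfield idea above is to acknowledge a base message id plus a bitmask of the messages received after it; the encoding below is a hedged illustration, not the SSU wire format.

    // Sketch: pack ACKs for messageIds in [base, base+63] into one long.
    class AckBitfieldSketch {
        static long encode(long base, long[] receivedIds) {
            long bits = 0;
            for (long id : receivedIds) {
                long offset = id - base;
                if (offset >= 0 && offset < 64)
                    bits |= 1L << offset;
            }
            return bits;
        }
    }
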
jrandom
3de23d4206 2005-05-01 jrandom
* Cleaned up the peers page a bit more.
more udp stuff:
* add new config option: i2np.udp.alwaysPreferred=true to adjust the bidding
  so that UDP is picked first, even if a TCP connection exists
* fixed the initial clock skew problem (duh)
* reduced the MTU to 576 (largest nearly-universally-safe, and allows a
  tunnel message in 2 fragments)
* handle some races @ connection establishment (thanks duck!)
* if there are more ACKs than we can send in a packet, reschedule another
  ACK immediately
2005-05-01 17:21:48 +00:00
jrandom
ea82f2a8cc oops (thanks newkid!) 2005-05-01 01:35:23 +00:00
jrandom
b5ad7642bc 2005-04-30 jrandom
* Added a small new page to the web console (/peers.jsp) which contains
      the peer connection information.  This will be cleaned up a lot more
      before 0.6 is out, but it's a start.
2005-05-01 00:48:15 +00:00
jrandom
0fbe84e9f0 2005-04-30 jrandom
* Reduced some SimpleTimer churn
* add hooks for per-peer choking in the outbound message queue - if/when a
  peer reaches their cwin, no further messages will enter the 'active' pool
  until there are more bytes available.  other messages waiting (either later
  on in the same priority queue, or in the queues for other priorities) may
  take that slot.
* when we have a message acked, release the acked size to the congestion
  window (duh), rather than waiting for the second to expire and refill the
  capacity.
* send packets in a volley explicitly, waiting until we can allocate the full
  cwin size for that message
2005-04-30 23:26:18 +00:00
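
The "release the acked size to the congestion window" point above suggests a byte-counting allocator along these lines; field names, the initial window, and synchronization are assumptions.

    // Sketch: track in-flight bytes against a per-peer congestion window and
    // free capacity as soon as data is acknowledged, instead of waiting for
    // the next one-second refill.
    class CwinSketch {
        private long windowBytes = 8 * 1024;
        private long inFlightBytes = 0;

        synchronized boolean allocate(int packetSize) {
            if (inFlightBytes + packetSize > windowBytes)
                return false;                 // caller keeps the message queued
            inFlightBytes += packetSize;
            return true;
        }

        synchronized void acked(int packetSize) {
            inFlightBytes = Math.max(0, inFlightBytes - packetSize);
        }
    }
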
jrandom
8063889d23 udp updates:
* more stats. including per-peer KBps (updated every second)
* improved blocking/timeout situations on the send queue
* added drop simulation hook
* provide logical RTO limits
2005-04-30 03:14:09 +00:00
jrandom
6e1ac8e173 added elf.i2p, de-ebooks.i2p, i2pchan.i2p, longhorn.i2p 2005-04-29 22:26:12 +00:00
jrandom
1b0bb5ea19 2005-04-29 jrandom
* Reduce the peer profile stat coalesce overhead by inlining it with the
      reorganize.
    * Limit each transport to at most one address (any transport that requires
      multiple entry points can include those alternatives in the address).
udp stuff:
* change the UDP transport's style from "udp" to "SSUv1"
* keep track of each peer's skew
* properly handle session reestablishment over an existing session, rather
  than requiring both sides to expire first
2005-04-29 06:24:12 +00:00
jrandom
4ce51261f1 2005-04-28 jrandom
* More fixes for the I2PTunnel "other" interface handling (thanks nelgin!)
    * Add back the code to handle bids from multiple transports (though there
      is still only one transport enabled by default)
    * Adjust the router's queueing of outbound client messages when under
      heavy load by running the preparatory job in the client's I2CP handler
      thread, thereby blocking additional outbound messages when the router is
      hosed.
    * No need to validate or persist a netDb entry if we already have it
And for some udp stuff:
* only bid on what we know (duh)
* reduced the queue size in the UDPSender itself, so that ACKs go
  through more quickly, leaving the payload messages to queue up in
  the outbound fragment scheduler
* on congestion, cut the window to 2/3 rather than 1/2 (still AIMD, but less drastic)
* adjust the fragment selector so a wsiz throttle won't force extra
  volleys
* mark congestion when it occurs, not after the message has been
  ACKed
* when doing a round robin over the active messages, move on to the
  next after a full volley, not after each packet (causing less "fair"
  performance but better latency)
* reduced the lock contention in the inboundMessageFragments by
  moving the ack and complete queues to the ACKSender and
  MessageReceiver respectively (each of which have their own
  threads)
* prefer new and existing UDP sessions to new TCP sessions, but
  prefer existing TCP sessions to new UDP sessions
2005-04-28 21:54:27 +00:00
jrandom
6e34d9b73e added amobius.i2p 2005-04-28 02:11:02 +00:00
jrandom
6e01637400 added google.i2p 2005-04-27 21:30:53 +00:00
jrandom
9a96798f9f added mrplod.i2p 2005-04-27 03:58:00 +00:00
smeghead
c9db6f87d1 2005-04-25 smeghead
* Added button to router console for manual update checks.
    * Fixed bug in configupdate.jsp that caused the proxy port to be updated
      every time the form was submitted even if it hadn't changed.
2005-04-26 02:59:23 +00:00
jrandom
567ce84e1e * randomized the shitlist duration (still with exponential backoff though)
* fail UDP sessions after two consecutive failed messages in different minutes
* honor UDP reconnections
2005-04-25 16:29:48 +00:00
jrandom
cde7ac7e52 2005-04-24 jrandom
* Added a pool of PRNGs using a different synchronization technique,
      hopefully sufficient to work around IBM's PRNG bugs until we get our
      own Fortuna.
    * In the streaming lib, don't jack up the RTT on NACK, and have the window
      size bound the not-yet-ready messages to the peer, not the unacked
      message count (not sure yet whether this is worthwhile).
    * Many additions to the messageHistory log.
    * Handle out of order tunnel fragment delivery (not an issue on the live
      net with TCP, but critical with UDP).
2005-04-24 18:44:59 +00:00
jrandom
b2f0d17e94 2005-04-24 jrandom
* Added a pool of PRNGs using a different synchronization technique,
      hopefully sufficient to work around IBM's PRNG bugs until we get our
      own Fortuna.
    * In the streaming lib, don't jack up the RTT on NACK, and have the window
      size bound the not-yet-ready messages to the peer, not the unacked
      message count (not sure yet whether this is worthwhile).
    * Many additions to the messageHistory log.
    * Handle out of order tunnel fragment delivery (not an issue on the live
      net with TCP, but critical with UDP).
and for udp stuff:
* implemented tcp-esque rto code in the udp transport
* make sure we don't ACK too many messages at once
* transmit fragments in a simple (nonrandom) order so that we can more easily
  adjust timeouts/etc.
* let the active outbound pool grow dynamically if there are outbound slots to
  spare
* use a simple decaying bloom filter at the UDP level to drop duplicate resent
  packets.
2005-04-24 18:42:02 +00:00
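
A decaying Bloom filter like the one mentioned above can be approximated with two alternating filters that are swapped and cleared on a timer; the sketch below uses BitSet and a single hash for brevity, where a real filter would use several hash functions.

    import java.util.BitSet;

    // Sketch only: drop packets whose key was seen in the current or the
    // previous decay period.
    class DecayingBloomSketch {
        private final int size = 1 << 20;
        private BitSet current = new BitSet(size);
        private BitSet previous = new BitSet(size);

        synchronized boolean addAndCheckDuplicate(long packetKey) {
            int bit = (int) ((packetKey ^ (packetKey >>> 32)) & (size - 1));
            boolean seen = current.get(bit) || previous.get(bit);
            current.set(bit);
            return seen;
        }

        synchronized void decay() {        // called periodically
            BitSet tmp = previous;
            previous = current;
            tmp.clear();
            current = tmp;
        }
    }
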
polecat
dae6be14b7 I removed those dumb platform specific makefiles. They weren't doing what they ought anyway. If there are platform specific issues, someone please tell me and I'll provide support for it here. Or patch it yourself.
And this is the big "Fix the Parser" patch.  It turns the sam_parse function in src/parse.c into something that actually works.  Generating the argument list from an incoming SAM thingy is a bit memory churn-y; perhaps when I have time I'll replace all those strdups with structures that simply track the (start,end) indices.
Oh and also I moved i2p-ping to the new system.  Which required 0 change in code.  All I did was fix the Makefile, and add shared library libtool support.  Anyway, so enjoy folks.  It's rare I'm this productive
- polecat
2005-04-23 03:28:40 +00:00
jrandom
20cec857d2 signed with the latest 2005-04-21 16:26:46 +00:00
aum
739f694cfe Node shutdown now uses halt() 2005-04-21 03:10:16 +00:00
aum
84779002fb now builds a working Q console 2005-04-20 21:35:05 +00:00
384 changed files with 21754 additions and 5347 deletions


@@ -6,8 +6,7 @@
<property name="dist" location="dist"/>
<property name="jar" value="addressbook.jar"/>
<property name="war" value="addressbook.war"/>
<property name="servlet" value="../jetty/jettylib/javax.servlet.jar"/>
<target name="init">
<mkdir dir="${build}"/>
<mkdir dir="${dist}"/>
@@ -22,7 +21,12 @@
<target name="compile" depends="init">
<javac debug="true" deprecation="on" source="1.3" target="1.3"
srcdir="${src}" destdir="${build}" classpath="${servlet}"/>
srcdir="${src}" destdir="${build}">
<classpath>
<pathelement location="../../core/java/build/i2p.jar" />
<pathelement location="../jetty/jettylib/javax.servlet.jar" />
</classpath>
</javac>
</target>
<target name="jar" depends="compile">


@@ -24,11 +24,12 @@ package addressbook;
import java.util.Map;
import java.util.HashMap;
import java.util.Iterator;
import java.net.URL;
import java.net.HttpURLConnection;
import java.io.File;
import java.io.IOException;
import net.i2p.I2PAppContext;
import net.i2p.util.EepGet;
/**
* An address book for storing human readable names mapped to base64 i2p
* destinations. AddressBooks can be created from local and remote files, merged
@@ -65,14 +66,18 @@ public class AddressBook {
* where key is a human readable name, and value is a base64 i2p
* destination.
*/
public AddressBook(URL url) {
this.location = url.getHost();
public AddressBook(String url, String proxyHost, int proxyPort) {
this.location = url;
EepGet get = new EepGet(I2PAppContext.getGlobalContext(), true,
proxyHost, proxyPort, 0, "addressbook.tmp", url, true,
null);
get.fetch();
try {
this.addresses = ConfigParser.parse(url);
this.addresses = ConfigParser.parse(new File("addressbook.tmp"));
} catch (IOException exp) {
this.addresses = new HashMap();
}
new File("addressbook.tmp").delete();
}
/**
@@ -83,43 +88,19 @@ public class AddressBook {
* @param subscription
* A Subscription instance pointing at a remote address book.
*/
public AddressBook(Subscription subscription) {
public AddressBook(Subscription subscription, String proxyHost, int proxyPort) {
this.location = subscription.getLocation();
try {
URL url = new URL(subscription.getLocation());
HttpURLConnection connection = (HttpURLConnection) url
.openConnection();
if (subscription.getEtag() != null) {
connection.addRequestProperty("If-None-Match", subscription
.getEtag());
}
if (subscription.getLastModified() != null) {
connection.addRequestProperty("If-Modified-Since", subscription
.getLastModified());
}
connection.connect();
if (connection.getResponseCode() == HttpURLConnection.HTTP_NOT_MODIFIED) {
connection.disconnect();
this.addresses = new HashMap();
return;
}
if (connection.getHeaderField("ETag") != null) {
subscription.setEtag(connection.getHeaderField("ETag"));
}
if (connection.getHeaderField("Last-Modified") != null) {
subscription.setLastModified(connection
.getHeaderField("Last-Modified"));
}
} catch (IOException exp) {
}
try {
this.addresses = ConfigParser.parse(new URL(subscription
.getLocation()));
EepGet get = new EepGet(I2PAppContext.getGlobalContext(), true,
proxyHost, proxyPort, 0, "addressbook.tmp",
subscription.getLocation(), true, subscription.getEtag());
get.fetch();
subscription.setEtag(get.getETag());
try {
this.addresses = ConfigParser.parse(new File("addressbook.tmp"));
} catch (IOException exp) {
this.addresses = new HashMap();
}
new File("addressbook.tmp").delete();
}
/**
@@ -181,7 +162,7 @@ public class AddressBook {
* @param log
* The log to write messages about new addresses or conflicts to.
*/
public void merge(AddressBook other, Log log) {
public void merge(AddressBook other, boolean overwrite, Log log) {
Iterator otherIter = other.addresses.keySet().iterator();
while (otherIter.hasNext()) {
@@ -189,7 +170,7 @@ public class AddressBook {
String otherValue = (String) other.addresses.get(otherKey);
if (otherKey.endsWith(".i2p") && otherValue.length() >= 516) {
if (this.addresses.containsKey(otherKey)) {
if (this.addresses.containsKey(otherKey) && !overwrite) {
if (!this.addresses.get(otherKey).equals(otherValue)
&& log != null) {
log.append("Conflict for " + otherKey + " from "
@@ -197,28 +178,19 @@ public class AddressBook {
+ ". Destination in remote address book is "
+ otherValue);
}
} else {
} else if (!this.addresses.containsKey(otherKey)
|| !this.addresses.get(otherKey).equals(otherValue)) {
this.addresses.put(otherKey, otherValue);
this.modified = true;
if (log != null) {
log.append("New address " + otherKey
+ " added to address book.");
+ " added to address book.");
}
}
}
}
}
/**
* Merge this AddressBook with other, without logging.
*
* @param other
* An AddressBook to merge with.
*/
public void merge(AddressBook other) {
this.merge(other, null);
}
/**
* Write the contents of this AddressBook out to the File file. If the file
* cannot be writen to, this method will silently fail.
@@ -243,4 +215,4 @@ public class AddressBook {
public void write() {
this.write(new File(this.location));
}
}
}


@@ -27,7 +27,6 @@ import java.util.List;
import java.util.LinkedList;
import java.util.Iterator;
import java.io.*;
import java.net.URL;
/**
* Utility class providing methods to parse and write files in config file
@@ -86,24 +85,6 @@ public class ConfigParser {
return result;
}
/**
* Return a Map using the contents of the file at url. See
* parseBufferedReader for details of the input format.
*
* @param url
* A url pointing to a file to parse.
* @return A Map containing the key, value pairs from url.
* @throws IOException
* if url cannot be read.
*/
public static Map parse(URL url) throws IOException {
InputStream urlStream;
urlStream = url.openConnection().getInputStream();
BufferedReader input = new BufferedReader(new InputStreamReader(
urlStream));
return ConfigParser.parse(input);
}
/**
* Return a Map using the contents of the File file. See parseBufferedReader
* for details of the input format.


@@ -58,15 +58,14 @@ public class Daemon {
*/
public static void update(AddressBook master, AddressBook router,
File published, SubscriptionList subscriptions, Log log) {
String routerLocation = router.getLocation();
master.merge(router);
router.merge(master, true, null);
Iterator iter = subscriptions.iterator();
while (iter.hasNext()) {
master.merge((AddressBook) iter.next(), log);
router.merge((AddressBook) iter.next(), false, log);
}
master.write(new File(routerLocation));
router.write();
if (published != null)
master.write(published);
router.write(published);
subscriptions.write();
}
@@ -101,7 +100,8 @@ public class Daemon {
defaultSubs.add("http://i2p/NF2RLVUxVulR3IqK0sGJR0dHQcGXAzwa6rEO4WAWYXOHw-DoZhKnlbf1nzHXwMEJoex5nFTyiNMqxJMWlY54cvU~UenZdkyQQeUSBZXyuSweflUXFqKN-y8xIoK2w9Ylq1k8IcrAFDsITyOzjUKoOPfVq34rKNDo7fYyis4kT5bAHy~2N1EVMs34pi2RFabATIOBk38Qhab57Umpa6yEoE~rbyR~suDRvD7gjBvBiIKFqhFueXsR2uSrPB-yzwAGofTXuklofK3DdKspciclTVzqbDjsk5UXfu2nTrC1agkhLyqlOfjhyqC~t1IXm-Vs2o7911k7KKLGjB4lmH508YJ7G9fLAUyjuB-wwwhejoWqvg7oWvqo4oIok8LG6ECR71C3dzCvIjY2QcrhoaazA9G4zcGMm6NKND-H4XY6tUWhpB~5GefB3YczOqMbHq4wi0O9MzBFrOJEOs3X4hwboKWANf7DT5PZKJZ5KorQPsYRSq0E3wSOsFCSsdVCKUGsAAAA/i2p/hosts.txt");
SubscriptionList subscriptions = new SubscriptionList(subscriptionFile,
etagsFile, lastModifiedFile, defaultSubs);
etagsFile, lastModifiedFile, defaultSubs, (String) settings
.get("proxy_host"), Integer.parseInt((String) settings.get("proxy_port")));
Log log = new Log(logFile);
Daemon.update(master, router, published, subscriptions, log);
@@ -154,11 +154,6 @@ public class Daemon {
while (true) {
settings = ConfigParser.parse(settingsFile, defaultSettings);
System.setProperty("proxySet", "true");
System.setProperty("http.proxyHost", (String) settings
.get("proxy_host"));
System.setProperty("http.proxyPort", (String) settings
.get("proxy_port"));
long delay = Long.parseLong((String) settings.get("update_delay"));
if (delay < 1) {
delay = 1;


@@ -33,6 +33,8 @@ import java.util.List;
public class SubscriptionIterator implements Iterator {
private Iterator subIterator;
private String proxyHost;
private int proxyPort;
/**
* Construct a SubscriptionIterator using the Subscriprions in List subscriptions.
@@ -40,8 +42,10 @@ public class SubscriptionIterator implements Iterator {
* @param subscriptions
* List of Subscription objects that represent address books.
*/
public SubscriptionIterator(List subscriptions) {
public SubscriptionIterator(List subscriptions, String proxyHost, int proxyPort) {
this.subIterator = subscriptions.iterator();
this.proxyHost = proxyHost;
this.proxyPort = proxyPort;
}
@@ -49,15 +53,15 @@ public class SubscriptionIterator implements Iterator {
* @see java.util.Iterator#hasNext()
*/
public boolean hasNext() {
return subIterator.hasNext();
return this.subIterator.hasNext();
}
/* (non-Javadoc)
* @see java.util.Iterator#next()
*/
public Object next() {
Subscription sub = (Subscription) subIterator.next();
return new AddressBook(sub);
Subscription sub = (Subscription) this.subIterator.next();
return new AddressBook(sub, this.proxyHost, this.proxyPort);
}
/* (non-Javadoc)


@@ -42,6 +42,10 @@ public class SubscriptionList {
private File etagsFile;
private File lastModifiedFile;
private String proxyHost;
private int proxyPort;
/**
* Construct a SubscriptionList using the urls from locationsFile and, if
@@ -58,15 +62,18 @@ public class SubscriptionList {
* GET. The file is in the format "url=leastmodified".
*/
public SubscriptionList(File locationsFile, File etagsFile,
File lastModifiedFile, List defaultSubs) {
File lastModifiedFile, List defaultSubs, String proxyHost,
int proxyPort) {
this.subscriptions = new LinkedList();
this.etagsFile = etagsFile;
this.lastModifiedFile = lastModifiedFile;
List locations;
this.proxyHost = proxyHost;
this.proxyPort = proxyPort;
Map etags;
Map lastModified;
String location;
locations = ConfigParser.parseSubscriptions(locationsFile, defaultSubs);
List locations = ConfigParser.parseSubscriptions(locationsFile,
defaultSubs);
try {
etags = ConfigParser.parse(etagsFile);
} catch (IOException exp) {
@@ -80,11 +87,9 @@ public class SubscriptionList {
Iterator iter = locations.iterator();
while (iter.hasNext()) {
location = (String) iter.next();
subscriptions.add(new Subscription(location, (String) etags
this.subscriptions.add(new Subscription(location, (String) etags
.get(location), (String) lastModified.get(location)));
}
iter = this.iterator();
}
/**
@@ -94,7 +99,8 @@ public class SubscriptionList {
* @return A SubscriptionIterator.
*/
public SubscriptionIterator iterator() {
return new SubscriptionIterator(this.subscriptions);
return new SubscriptionIterator(this.subscriptions, this.proxyHost,
this.proxyPort);
}
/**


@@ -111,12 +111,12 @@ public class Bogobot extends PircBot {
_botShutdownPassword = config.getProperty("botShutdownPassword", "take off eh");
_ircChannel = config.getProperty("ircChannel", "#i2p-chat");
_ircServer = config.getProperty("ircServer", "irc.duck.i2p");
_ircServer = config.getProperty("ircServer", "irc.postman.i2p");
_ircServerPort = Integer.parseInt(config.getProperty("ircServerPort", "6668"));
_isLoggerEnabled = Boolean.valueOf(config.getProperty("isLoggerEnabled", "true")).booleanValue();
_loggedHostnamePattern = config.getProperty("loggedHostnamePattern", "");
_logFilePrefix = config.getProperty("logFilePrefix", "irc.duck.i2p.i2p-chat");
_logFilePrefix = config.getProperty("logFilePrefix", "irc.postman.i2p.i2p-chat");
_logFileRotationInterval = config.getProperty("logFileRotationInterval", INTERVAL_DAILY);
_isRoundTripDelayEnabled = Boolean.valueOf(config.getProperty("isRoundTripDelayEnabled", "false")).booleanValue();


@@ -109,8 +109,9 @@ public class I2PTunnel implements Logging, EventDispatcher {
_tunnelId = ++__tunnelId;
_log = _context.logManager().getLog(I2PTunnel.class);
_event = new EventDispatcherImpl();
_clientOptions = new Properties();
_clientOptions.putAll(System.getProperties());
Properties p = new Properties();
p.putAll(System.getProperties());
_clientOptions = p;
_sessions = new ArrayList(1);
addConnectionEventListener(lsnr);
@@ -1146,6 +1147,8 @@ public class I2PTunnel implements Logging, EventDispatcher {
}
private String getPrefix() { return '[' + _tunnelId + "]: "; }
public I2PAppContext getContext() { return _context; }
/**
* Call this whenever we lose touch with the router involuntarily (aka the router


@@ -101,6 +101,10 @@ public abstract class I2PTunnelClientBase extends I2PTunnelTask implements Runna
this.l = l;
this.handlerName = handlerName + _clientId;
// no need to load the netDb with leaseSets for destinations that will never
// be looked up
tunnel.getClientOptions().setProperty("i2cp.dontPublishLeaseSet", "true");
while (sockMgr == null) {
synchronized (sockLock) {
if (ownDest) {
@@ -110,7 +114,7 @@ public abstract class I2PTunnelClientBase extends I2PTunnelTask implements Runna
}
}
if (sockMgr == null) {
_log.log(Log.CRIT, "Unable to create socket manager");
_log.log(Log.CRIT, "Unable to create socket manager (our own? " + ownDest + ")");
try { Thread.sleep(10*1000); } catch (InterruptedException ie) {}
}
}


@@ -35,85 +35,68 @@ public class I2PTunnelHTTPServer extends I2PTunnelServer {
public I2PTunnelHTTPServer(InetAddress host, int port, String privData, String spoofHost, Logging l, EventDispatcher notifyThis, I2PTunnel tunnel) {
super(host, port, privData, l, notifyThis, tunnel);
_spoofHost = spoofHost;
getTunnel().getContext().statManager().createRateStat("i2ptunnel.httpserver.blockingHandleTime", "how long the blocking handle takes to complete", "I2PTunnel.HTTPServer", new long[] { 60*1000, 10*60*1000, 3*60*60*1000 });
}
public I2PTunnelHTTPServer(InetAddress host, int port, File privkey, String privkeyname, String spoofHost, Logging l, EventDispatcher notifyThis, I2PTunnel tunnel) {
super(host, port, privkey, privkeyname, l, notifyThis, tunnel);
_spoofHost = spoofHost;
getTunnel().getContext().statManager().createRateStat("i2ptunnel.httpserver.blockingHandleTime", "how long the blocking handle takes to complete", "I2PTunnel.HTTPServer", new long[] { 60*1000, 10*60*1000, 3*60*60*1000 });
}
public I2PTunnelHTTPServer(InetAddress host, int port, InputStream privData, String privkeyname, String spoofHost, Logging l, EventDispatcher notifyThis, I2PTunnel tunnel) {
super(host, port, privData, privkeyname, l, notifyThis, tunnel);
_spoofHost = spoofHost;
_spoofHost = spoofHost;
getTunnel().getContext().statManager().createRateStat("i2ptunnel.httpserver.blockingHandleTime", "how long the blocking handle takes to complete", "I2PTunnel.HTTPServer", new long[] { 60*1000, 10*60*1000, 3*60*60*1000 });
}
public void run() {
try {
I2PServerSocket i2pss = sockMgr.getServerSocket();
while (true) {
I2PSocket i2ps = i2pss.accept();
if (i2ps == null) throw new I2PException("I2PServerSocket closed");
I2PThread t = new I2PThread(new Handler(i2ps));
t.start();
}
} catch (I2PException ex) {
_log.error("Error while waiting for I2PConnections", ex);
} catch (IOException ex) {
_log.error("Error while waiting for I2PConnections", ex);
}
}
/**
* Async handler to keep .accept() from blocking too long.
* todo: replace with a thread pool so we dont get overrun by threads if/when
* receiving a lot of connection requests concurrently.
* Called by the thread pool of I2PSocket handlers
*
*/
private class Handler implements Runnable {
private I2PSocket _handleSocket;
public Handler(I2PSocket socket) {
_handleSocket = socket;
}
public void run() {
long afterAccept = I2PAppContext.getGlobalContext().clock().now();
long afterSocket = -1;
//local is fast, so synchronously. Does not need that many
//threads.
protected void blockingHandle(I2PSocket socket) {
long afterAccept = getTunnel().getContext().clock().now();
long afterSocket = -1;
//local is fast, so synchronously. Does not need that many
//threads.
try {
// give them 5 seconds to send in the HTTP request
socket.setReadTimeout(5*1000);
String modifiedHeader = getModifiedHeader(socket);
if (_log.shouldLog(Log.DEBUG))
_log.debug("Modified header: [" + modifiedHeader + "]");
socket.setReadTimeout(readTimeout);
Socket s = new Socket(remoteHost, remotePort);
afterSocket = getTunnel().getContext().clock().now();
new I2PTunnelRunner(s, socket, slock, null, modifiedHeader.getBytes(), null);
} catch (SocketException ex) {
try {
_handleSocket.setReadTimeout(readTimeout);
String modifiedHeader = getModifiedHeader();
if (_log.shouldLog(Log.DEBUG))
_log.debug("Modified header: [" + modifiedHeader + "]");
Socket s = new Socket(remoteHost, remotePort);
afterSocket = I2PAppContext.getGlobalContext().clock().now();
new I2PTunnelRunner(s, _handleSocket, slock, null, modifiedHeader.getBytes(), null);
} catch (SocketException ex) {
try {
_handleSocket.close();
} catch (IOException ioe) {
socket.close();
} catch (IOException ioe) {
if (_log.shouldLog(Log.ERROR))
_log.error("Error while closing the received i2p con", ex);
}
} catch (IOException ex) {
_log.error("Error while waiting for I2PConnections", ex);
}
long afterHandle = I2PAppContext.getGlobalContext().clock().now();
long timeToHandle = afterHandle - afterAccept;
if (timeToHandle > 1000)
_log.warn("Took a while to handle the request [" + timeToHandle + ", socket create: "
+ (afterSocket-afterAccept) + "]");
}
private String getModifiedHeader() throws IOException {
InputStream in = _handleSocket.getInputStream();
StringBuffer command = new StringBuffer(128);
Properties headers = readHeaders(in, command);
headers.setProperty("Host", _spoofHost);
headers.setProperty("Connection", "close");
return formatHeaders(headers, command);
} catch (IOException ex) {
if (_log.shouldLog(Log.WARN))
_log.warn("Error while receiving the new HTTP request", ex);
}
long afterHandle = getTunnel().getContext().clock().now();
long timeToHandle = afterHandle - afterAccept;
getTunnel().getContext().statManager().addRateData("i2ptunnel.httpserver.blockingHandleTime", timeToHandle, 0);
if ( (timeToHandle > 1000) && (_log.shouldLog(Log.WARN)) )
_log.warn("Took a while to handle the request [" + timeToHandle + ", socket create: " + (afterSocket-afterAccept) + "]");
}
private String getModifiedHeader(I2PSocket handleSocket) throws IOException {
InputStream in = handleSocket.getInputStream();
StringBuffer command = new StringBuffer(128);
Properties headers = readHeaders(in, command);
headers.setProperty("Host", _spoofHost);
headers.setProperty("Connection", "close");
return formatHeaders(headers, command);
}
private String formatHeaders(Properties headers, StringBuffer command) {


@@ -11,6 +11,7 @@ import java.io.InputStream;
import java.net.InetAddress;
import java.net.Socket;
import java.net.SocketException;
import java.net.ConnectException;
import java.util.Iterator;
import java.util.Properties;
@@ -39,6 +40,7 @@ public class I2PTunnelServer extends I2PTunnelTask implements Runnable {
protected InetAddress remoteHost;
protected int remotePort;
private boolean _usePool;
private Logging l;
@@ -46,15 +48,27 @@ public class I2PTunnelServer extends I2PTunnelTask implements Runnable {
/** default timeout to 3 minutes - override if desired */
protected long readTimeout = DEFAULT_READ_TIMEOUT;
private static final boolean DEFAULT_USE_POOL = false;
public I2PTunnelServer(InetAddress host, int port, String privData, Logging l, EventDispatcher notifyThis, I2PTunnel tunnel) {
super(host + ":" + port + " <- " + privData, notifyThis, tunnel);
ByteArrayInputStream bais = new ByteArrayInputStream(Base64.decode(privData));
String usePool = tunnel.getClientOptions().getProperty("i2ptunnel.usePool");
if (usePool != null)
_usePool = "true".equalsIgnoreCase(usePool);
else
_usePool = DEFAULT_USE_POOL;
init(host, port, bais, privData, l);
}
public I2PTunnelServer(InetAddress host, int port, File privkey, String privkeyname, Logging l,
EventDispatcher notifyThis, I2PTunnel tunnel) {
super(host + ":" + port + " <- " + privkeyname, notifyThis, tunnel);
String usePool = tunnel.getClientOptions().getProperty("i2ptunnel.usePool");
if (usePool != null)
_usePool = "true".equalsIgnoreCase(usePool);
else
_usePool = DEFAULT_USE_POOL;
try {
init(host, port, new FileInputStream(privkey), privkeyname, l);
} catch (IOException ioe) {
@@ -65,6 +79,11 @@ public class I2PTunnelServer extends I2PTunnelTask implements Runnable {
public I2PTunnelServer(InetAddress host, int port, InputStream privData, String privkeyname, Logging l, EventDispatcher notifyThis, I2PTunnel tunnel) {
super(host + ":" + port + " <- " + privkeyname, notifyThis, tunnel);
String usePool = tunnel.getClientOptions().getProperty("i2ptunnel.usePool");
if (usePool != null)
_usePool = "true".equalsIgnoreCase(usePool);
else
_usePool = DEFAULT_USE_POOL;
init(host, port, privData, privkeyname, l);
}
@@ -159,58 +178,103 @@ public class I2PTunnelServer extends I2PTunnelTask implements Runnable {
}
}
private static final String PROP_HANDLER_COUNT = "i2ptunnel.blockingHandlerCount";
private static final int DEFAULT_HANDLER_COUNT = 10;
protected int getHandlerCount() {
int rv = DEFAULT_HANDLER_COUNT;
String cnt = getTunnel().getClientOptions().getProperty(PROP_HANDLER_COUNT);
if (cnt != null) {
try {
rv = Integer.parseInt(cnt);
if (rv <= 0)
rv = DEFAULT_HANDLER_COUNT;
} catch (NumberFormatException nfe) {
rv = DEFAULT_HANDLER_COUNT;
}
}
return rv;
}
public void run() {
try {
if (shouldUsePool()) {
I2PServerSocket i2pss = sockMgr.getServerSocket();
int handlers = getHandlerCount();
for (int i = 0; i < handlers; i++) {
I2PThread handler = new I2PThread(new Handler(i2pss), "Handle Server " + i);
handler.start();
}
} else {
I2PServerSocket i2pss = sockMgr.getServerSocket();
while (true) {
I2PSocket i2ps = i2pss.accept();
if (i2ps == null) throw new I2PException("I2PServerSocket closed");
I2PThread t = new I2PThread(new Handler(i2ps));
t.start();
try {
final I2PSocket i2ps = i2pss.accept();
if (i2ps == null) throw new I2PException("I2PServerSocket closed");
new I2PThread(new Runnable() { public void run() { blockingHandle(i2ps); } }).start();
} catch (I2PException ipe) {
if (_log.shouldLog(Log.ERROR))
_log.error("Error accepting - KILLING THE TUNNEL SERVER", ipe);
return;
} catch (ConnectException ce) {
if (_log.shouldLog(Log.ERROR))
_log.error("Error accepting", ce);
// not killing the server..
}
}
} catch (I2PException ex) {
_log.error("Error while waiting for I2PConnections", ex);
} catch (IOException ex) {
_log.error("Error while waiting for I2PConnections", ex);
}
}
public boolean shouldUsePool() { return _usePool; }
/**
* Async handler to keep .accept() from blocking too long.
* todo: replace with a thread pool so we dont get overrun by threads if/when
* receiving a lot of connection requests concurrently.
* minor thread pool to pull off the accept() concurrently. there are still lots
* (and lots) of wasted threads within the I2PTunnelRunner, but its a start
*
*/
private class Handler implements Runnable {
private I2PSocket _handleSocket;
public Handler(I2PSocket socket) {
_handleSocket = socket;
private I2PServerSocket _serverSocket;
public Handler(I2PServerSocket serverSocket) {
_serverSocket = serverSocket;
}
public void run() {
long afterAccept = I2PAppContext.getGlobalContext().clock().now();
long afterSocket = -1;
//local is fast, so synchronously. Does not need that many
//threads.
try {
_handleSocket.setReadTimeout(readTimeout);
Socket s = new Socket(remoteHost, remotePort);
afterSocket = I2PAppContext.getGlobalContext().clock().now();
new I2PTunnelRunner(s, _handleSocket, slock, null, null);
} catch (SocketException ex) {
while (open) {
try {
_handleSocket.close();
} catch (IOException ioe) {
_log.error("Error while closing the received i2p con", ex);
blockingHandle(_serverSocket.accept());
} catch (I2PException ex) {
_log.error("Error while waiting for I2PConnections", ex);
return;
} catch (IOException ex) {
_log.error("Error while waiting for I2PConnections", ex);
return;
}
} catch (IOException ex) {
_log.error("Error while waiting for I2PConnections", ex);
}
long afterHandle = I2PAppContext.getGlobalContext().clock().now();
long timeToHandle = afterHandle - afterAccept;
if (timeToHandle > 1000)
_log.warn("Took a while to handle the request [" + timeToHandle + ", socket create: " + (afterSocket-afterAccept) + "]");
}
}
protected void blockingHandle(I2PSocket socket) {
long afterAccept = I2PAppContext.getGlobalContext().clock().now();
long afterSocket = -1;
//local is fast, so synchronously. Does not need that many
//threads.
try {
socket.setReadTimeout(readTimeout);
Socket s = new Socket(remoteHost, remotePort);
afterSocket = I2PAppContext.getGlobalContext().clock().now();
new I2PTunnelRunner(s, socket, slock, null, null);
} catch (SocketException ex) {
try {
socket.close();
} catch (IOException ioe) {
_log.error("Error while closing the received i2p con", ex);
}
} catch (IOException ex) {
_log.error("Error while waiting for I2PConnections", ex);
}
long afterHandle = I2PAppContext.getGlobalContext().clock().now();
long timeToHandle = afterHandle - afterAccept;
if (timeToHandle > 1000)
_log.warn("Took a while to handle the request [" + timeToHandle + ", socket create: " + (afterSocket-afterAccept) + "]");
}
}


@@ -1,445 +0,0 @@
/* I2PTunnel is GPL'ed (with the exception mentioned in I2PTunnel.java)
* (c) 2003 - 2004 mihi
*/
package net.i2p.i2ptunnel;
import java.io.IOException;
import java.io.OutputStream;
import java.net.InetAddress;
import java.net.ServerSocket;
import java.net.Socket;
import net.i2p.data.DataFormatException;
import net.i2p.data.Destination;
import net.i2p.util.Clock;
import net.i2p.util.I2PThread;
import net.i2p.util.Log;
/**
* Quick and dirty socket listener to control an I2PTunnel.
* Basically run this class as TunnelManager [listenHost] [listenPort] and
* then send it commands on that port. Commands are one shot deals -
* Send a command + newline, get a response plus newline, then get disconnected.
* <p />
* <b>Implemented commands:</b>
* <pre>
* -------------------------------------------------
* lookup &lt;name&gt;\n
* --
* &lt;base64 of the destination&gt;\n
* or
* &lt;error message, usually 'Unknown host'&gt;\n
*
* Lookup the public key of a named destination (i.e. listed in hosts.txt)
* -------------------------------------------------
* genkey\n
* --
* &lt;base64 of the destination&gt;\t&lt;base64 of private data&gt;\n
*
* Generates a new public and private key pair
* -------------------------------------------------
* convertprivate &lt;base64 of privkey&gt;
* --
* &lt;base64 of destination&gt;\n
* or
* &lt;error message&gt;\n
*
* Returns the destination (pubkey) of a given private key.
* -------------------------------------------------
* listen_on &lt;ip&gt;\n
* --
* ok\n
* or
* error\n
*
* Sets the ip address clients will listen on. By default this is the
* localhost (127.0.0.1)
* -------------------------------------------------
* openclient &lt;listenPort&gt; &lt;peer&gt;[ &lt;sharedClient&gt;]\n
* --
* ok [&lt;jobId&gt;]\n
* or
* ok &lt;listenPort&gt; [&lt;jobId&gt;]\n
* or
* error\n
*
* Open a tunnel on the given &lt;listenport&gt; to the destination specified
* by &lt;peer&gt;. If &lt;listenPort&gt; is 0 a free port is picked and returned in
* the reply message. Otherwise the short reply message is used.
* Peer can be the base64 of the destination, a file with the public key
* specified as 'file:&lt;filename&gt;' or the name of a destination listed in
* hosts.txt. The &lt;jobId&gt; returned together with "ok" and &lt;listenport&gt; can
* later be used as argument for the "close" command.
* &lt;sharedClient&gt; indicates if this httpclient shares tunnels with other
* clients or not (just use 'true' and 'false'
* -------------------------------------------------
* openhttpclient &lt;listenPort&gt; [&lt;sharedClient&gt;] [&lt;proxy&gt;]\n
* --
* ok [&lt;jobId&gt;]\n
* or
* ok &lt;listenPort&gt; [&lt;jobId&gt;]\n
* or
* error\n
*
* Open an HTTP proxy through the I2P on the given
* &lt;listenport&gt;. &lt;proxy&gt; (optional) specifies a
* destination to be used as an outbound proxy, to access normal WWW
* sites out of the .i2p domain. If &lt;listenPort&gt; is 0 a free
* port is picked and returned in the reply message. Otherwise the
* short reply message is used. &lt;proxy&gt; can be the base64 of the
* destination, a file with the public key specified as
* 'file:&lt;filename&gt;' or the name of a destination listed in
* hosts.txt. The &lt;jobId&gt; returned together with "ok" and
* &lt;listenport&gt; can later be used as argument for the "close"
* command.
* &lt;sharedClient&gt; indicates if this httpclient shares tunnels with other
* clients or not (just use 'true' or 'false')
* -------------------------------------------------
* opensockstunnel &lt;listenPort&gt;\n
* --
* ok [&lt;jobId&gt;]\n
* or
* ok &lt;listenPort&gt; [&lt;jobId&gt;]\n
* or
* error\n
*
* Open a SOCKS tunnel through I2P on the given
* &lt;listenport&gt;. If &lt;listenPort&gt; is 0 a free port is
* picked and returned in the reply message. Otherwise the short
* reply message is used. The &lt;jobId&gt; returned together with
* "ok" and &lt;listenport&gt; can later be used as argument for the
* "close" command.
* -------------------------------------------------
* openserver &lt;serverHost&gt; &lt;serverPort&gt; &lt;serverKeys&gt;\n
* --
* ok [&lt;jobId&gt;]\n
* or
* error\n
*
* Starts receiving traffic for the destination specified by &lt;serverKeys&gt;
* and forwards it to the &lt;serverPort&gt; of &lt;serverHost&gt;.
* &lt;serverKeys&gt; is the base 64 encoded private key set of the local
* destination. The &lt;jobId&gt; returned together with "ok" can later be used
* as argument for the "close" command.
* -------------------------------------------------
* close [forced] &lt;jobId&gt;\n
* or
* close [forced] all\n
* --
* ok\n
* or
* error\n
*
* Closes the job specified by &lt;jobId&gt; or all jobs. Use the list command
* for a list of running jobs.
* Normally a connection job is not closed when it still has an active
* connection. Use the optional 'forced' keyword to close connections
* regardless of their use.
* -------------------------------------------------
* list\n
* --
* Example output:
*
* [0] i2p.dnsalias.net/69.55.226.145:5555 &lt;- C:\i2pKeys\squidPriv
* [1] 8767 -&gt; HTTPClient
* [2] 7575 -&gt; file:C:\i2pKeys\squidPub
* [3] 5252 -&gt; sCcSANIO~f4AQtCNI1BvDp3ZBS~9Ag5O0k0Msm7XBWWz5eOnZWL3MQ-2rxlesucb9XnpASGhWzyYNBpWAfaIB3pux1J1xujQLOwscMIhm7T8BP76Ly5jx6BLZCYrrPj0BI0uV90XJyT~4UyQgUlC1jzFQdZ9HDgBPJDf1UI4-YjIwEHuJgdZynYlQ1oUFhgno~HhcDByXO~PDaO~1JDMDbBEfIh~v6MgmHp-Xchod1OfKFrxFrzHgcJbn7E8edTFjZA6JCi~DtFxFelQz1lSBd-QB1qJnA0g-pVL5qngNUojXJCXs4qWcQ7ICLpvIc-Fpfj-0F1gkVlGDSGkb1yLH3~8p4czYgR3W5D7OpwXzezz6clpV8kmbd~x2SotdWsXBPRhqpewO38coU4dJG3OEUbuYmdN~nJMfWbmlcM1lXzz2vBsys4sZzW6dV3hZnbvbfxNTqbdqOh-KXi1iAzXv7CVTun0ubw~CfeGpcAqutC5loRUq7Mq62ngOukyv8Z9AAAA
*
* Lists descriptions of all running jobs. The exact format of the
* description depends on the type of job.
* -------------------------------------------------
* </pre>
*
*
* @deprecated this isn't run by default, no one seems to use it, and it has
*             lots of things to maintain. So, at some point this may disappear
*             unless someone pipes up ;)
*/
public class TunnelManager implements Runnable {
private final static Log _log = new Log(TunnelManager.class);
private I2PTunnel _tunnel;
private ServerSocket _socket;
private boolean _keepAccepting;
public TunnelManager(int listenPort) {
this(null, listenPort);
}
public TunnelManager(String listenHost, int listenPort) {
_tunnel = new I2PTunnel();
_keepAccepting = true;
try {
if (listenHost != null) {
_socket = new ServerSocket(listenPort, 0, InetAddress.getByName(listenHost));
_log.info("Listening for tunnel management clients on " + listenHost + ":" + listenPort);
} else {
_socket = new ServerSocket(listenPort);
_log.info("Listening for tunnel management clients on localhost:" + listenPort);
}
} catch (Exception e) {
_log.error("Error starting up tunnel management listener on " + listenPort, e);
}
}
public static void main(String args[]) {
int port = 7676;
String host = null;
if (args.length == 1) {
try {
port = Integer.parseInt(args[0]);
} catch (NumberFormatException nfe) {
_log.error("Usage: TunnelManager [host] [port]");
return;
}
} else if (args.length == 2) {
host = args[0];
try {
port = Integer.parseInt(args[1]);
} catch (NumberFormatException nfe) {
_log.error("Usage: TunnelManager [host] [port]");
return;
}
}
TunnelManager mgr = new TunnelManager(host, port);
Thread t = new I2PThread(mgr, "Listener");
t.start();
}
public void run() {
if (_socket == null) {
_log.error("Unable to start listening, since the socket was not bound. Already running?");
return;
}
_log.debug("Running");
try {
while (_keepAccepting) {
Socket socket = _socket.accept();
_log.debug("Client accepted");
if (socket != null) {
Thread t = new I2PThread(new TunnelManagerClientRunner(this, socket));
t.setName("TunnelManager Client");
t.setPriority(I2PThread.MIN_PRIORITY);
t.start();
}
}
} catch (IOException ioe) {
_log.error("Error accepting connections", ioe);
} catch (Exception e) {
_log.error("Other error?!", e);
} finally {
if (_socket != null) try {
_socket.close();
} catch (IOException ioe) {
}
}
try {
Thread.sleep(5000);
} catch (InterruptedException ie) {
}
}
public void error(String msg, OutputStream out) throws IOException {
out.write(msg.getBytes());
out.write('\n');
}
public void processQuit(OutputStream out) throws IOException {
out.write("Nice try".getBytes());
out.write('\n');
}
public void processList(OutputStream out) throws IOException {
BufferLogger buf = new BufferLogger();
long startCommand = Clock.getInstance().now();
_tunnel.runCommand("list", buf);
Object obj = _tunnel.waitEventValue("listDone");
long endCommand = Clock.getInstance().now();
String str = buf.getBuffer();
_log.debug("ListDone complete after " + (endCommand - startCommand) + "ms: [" + str + "]");
out.write(str.getBytes());
out.write('\n');
buf.ignoreFurtherActions();
}
public void processListenOn(String ip, OutputStream out) throws IOException {
BufferLogger buf = new BufferLogger();
_tunnel.runCommand("listen_on " + ip, buf);
String status = (String) _tunnel.waitEventValue("listen_onResult");
out.write((status + "\n").getBytes());
buf.ignoreFurtherActions();
}
/**
* "lookup <name>" returns with the result in base64, else "Unknown host" [or something like that],
* then a newline.
*
*/
public void processLookup(String name, OutputStream out) throws IOException {
BufferLogger buf = new BufferLogger();
_tunnel.runCommand("lookup " + name, buf);
String rv = (String) _tunnel.waitEventValue("lookupResult");
out.write(rv.getBytes());
out.write('\n');
buf.ignoreFurtherActions();
}
public void processTestDestination(String destKey, OutputStream out) throws IOException {
try {
Destination d = new Destination();
d.fromBase64(destKey);
out.write("valid\n".getBytes());
} catch (DataFormatException dfe) {
out.write("invalid\n".getBytes());
}
out.flush();
}
public void processConvertPrivate(String priv, OutputStream out) throws IOException {
try {
Destination dest = new Destination();
dest.fromBase64(priv);
String str = dest.toBase64();
out.write(str.getBytes());
out.write('\n');
} catch (DataFormatException dfe) {
_log.error("Error converting private data", dfe);
out.write("Error converting private key\n".getBytes());
}
}
public void processClose(String which, boolean forced, OutputStream out) throws IOException {
BufferLogger buf = new BufferLogger();
_tunnel.runCommand((forced ? "close forced " : "close ") + which, buf);
String str = (String) _tunnel.waitEventValue("closeResult");
out.write((str + "\n").getBytes());
buf.ignoreFurtherActions();
}
/**
* "genkey" returns with the base64 of the destination, followed by a tab, then the base64 of that
* destination's private keys, then a newline.
*
*/
public void processGenKey(OutputStream out) throws IOException {
BufferLogger buf = new BufferLogger();
_tunnel.runCommand("gentextkeys", buf);
String priv = (String) _tunnel.waitEventValue("privateKey");
String pub = (String) _tunnel.waitEventValue("publicDestination");
out.write((pub + "\t" + priv).getBytes());
out.write('\n');
buf.ignoreFurtherActions();
}
public void processOpenClient(int listenPort, String peer, String sharedClient, OutputStream out) throws IOException {
BufferLogger buf = new BufferLogger();
_tunnel.runCommand("client " + listenPort + " " + peer + " " + sharedClient, buf);
Integer taskId = (Integer) _tunnel.waitEventValue("clientTaskId");
if (taskId.intValue() < 0) {
out.write("error\n".getBytes());
buf.ignoreFurtherActions();
return;
}
String rv = (String) _tunnel.waitEventValue("openClientResult");
if (rv.equals("error")) {
out.write((rv + "\n").getBytes());
buf.ignoreFurtherActions();
return;
}
if (listenPort != 0) {
out.write((rv + " [" + taskId.intValue() + "]\n").getBytes());
buf.ignoreFurtherActions();
return;
}
Integer port = (Integer) _tunnel.waitEventValue("clientLocalPort");
out.write((rv + " " + port.intValue() + " [" + taskId.intValue() + "]\n").getBytes());
buf.ignoreFurtherActions();
}
public void processOpenHTTPClient(int listenPort, String sharedClient, String proxy, OutputStream out) throws IOException {
BufferLogger buf = new BufferLogger();
_tunnel.runCommand("httpclient " + listenPort + " " + sharedClient + " " + proxy, buf);
Integer taskId = (Integer) _tunnel.waitEventValue("httpclientTaskId");
if (taskId.intValue() < 0) {
out.write("error\n".getBytes());
buf.ignoreFurtherActions();
return;
}
String rv = (String) _tunnel.waitEventValue("openHTTPClientResult");
if (rv.equals("error")) {
out.write((rv + "\n").getBytes());
buf.ignoreFurtherActions();
return;
}
if (listenPort != 0) {
out.write((rv + " [" + taskId.intValue() + "]\n").getBytes());
buf.ignoreFurtherActions();
return;
}
Integer port = (Integer) _tunnel.waitEventValue("clientLocalPort");
out.write((rv + " " + port.intValue() + " [" + taskId.intValue() + "]\n").getBytes());
buf.ignoreFurtherActions();
}
public void processOpenSOCKSTunnel(int listenPort, OutputStream out) throws IOException {
BufferLogger buf = new BufferLogger();
_tunnel.runCommand("sockstunnel " + listenPort, buf);
Integer taskId = (Integer) _tunnel.waitEventValue("sockstunnelTaskId");
if (taskId.intValue() < 0) {
out.write("error\n".getBytes());
buf.ignoreFurtherActions();
return;
}
String rv = (String) _tunnel.waitEventValue("openSOCKSTunnelResult");
if (rv.equals("error")) {
out.write((rv + "\n").getBytes());
buf.ignoreFurtherActions();
return;
}
if (listenPort != 0) {
out.write((rv + " [" + taskId.intValue() + "]\n").getBytes());
buf.ignoreFurtherActions();
return;
}
Integer port = (Integer) _tunnel.waitEventValue("clientLocalPort");
out.write((rv + " " + port.intValue() + " [" + taskId.intValue() + "]\n").getBytes());
buf.ignoreFurtherActions();
}
public void processOpenServer(String serverHost, int serverPort, String privateKeys, OutputStream out)
throws IOException {
BufferLogger buf = new BufferLogger();
_tunnel.runCommand("textserver " + serverHost + " " + serverPort + " " + privateKeys, buf);
Integer taskId = (Integer) _tunnel.waitEventValue("serverTaskId");
if (taskId.intValue() < 0) {
out.write("error\n".getBytes());
buf.ignoreFurtherActions();
return;
}
String rv = (String) _tunnel.waitEventValue("openServerResult");
if (rv.equals("error")) {
out.write((rv + "\n").getBytes());
buf.ignoreFurtherActions();
return;
}
out.write((rv + " [" + taskId.intValue() + "]\n").getBytes());
buf.ignoreFurtherActions();
}
/**
* Frisbee.
*
*/
public void unknownCommand(String command, OutputStream out) throws IOException {
out.write("Unknown command: ".getBytes());
out.write(command.getBytes());
out.write("\n".getBytes());
}
}
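For reference, the protocol documented above is strictly one command per connection: write a command line, read a single response line, and the manager disconnects. A minimal sketch of a client exercising the "lookup" command against a locally running TunnelManager; the host, the port (7676 is the default in main() above), and the hostname being looked up are illustrative only.
import java.io.BufferedReader;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.Socket;
/** Minimal sketch of a one-shot TunnelManager command client (illustrative only). */
public class TunnelManagerLookup {
    public static void main(String[] args) throws Exception {
        Socket s = new Socket("127.0.0.1", 7676); // 7676 is the default port in TunnelManager.main()
        try {
            OutputStream out = s.getOutputStream();
            out.write("lookup someeepsite.i2p\n".getBytes()); // hostname here is illustrative
            out.flush();
            BufferedReader in = new BufferedReader(new InputStreamReader(s.getInputStream()));
            // one command per connection: a single response line, then the server disconnects
            System.out.println(in.readLine());
        } finally {
            s.close();
        }
    }
}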

View File

@@ -1,203 +0,0 @@
/* I2PTunnel is GPL'ed (with the exception mentioned in I2PTunnel.java)
* (c) 2003 - 2004 mihi
*/
package net.i2p.i2ptunnel;
import java.io.BufferedReader;
import java.io.IOException;
import java.io.InputStreamReader;
import java.io.OutputStream;
import java.net.Socket;
import java.util.StringTokenizer;
import net.i2p.util.Log;
/**
* Runner thread that reads commands from the socket and fires off commands to
* the TunnelManager
*
*/
class TunnelManagerClientRunner implements Runnable {
private final static Log _log = new Log(TunnelManagerClientRunner.class);
private TunnelManager _mgr;
private Socket _clientSocket;
public TunnelManagerClientRunner(TunnelManager mgr, Socket socket) {
_clientSocket = socket;
_mgr = mgr;
}
public void run() {
_log.debug("Client running");
try {
BufferedReader reader = new BufferedReader(new InputStreamReader(_clientSocket.getInputStream()));
OutputStream out = _clientSocket.getOutputStream();
String cmd = reader.readLine();
if (cmd != null) processCommand(cmd, out);
} catch (IOException ioe) {
_log.error("Error processing client commands", ioe);
} finally {
if (_clientSocket != null) try {
_clientSocket.close();
} catch (IOException ioe) {
}
}
_log.debug("Client closed");
}
/**
* Parse the command string and fire off the appropriate tunnelManager method,
* sending the results to the output stream
*/
private void processCommand(String command, OutputStream out) throws IOException {
_log.debug("Processing [" + command + "]");
StringTokenizer tok = new StringTokenizer(command);
if (!tok.hasMoreTokens()) {
_mgr.unknownCommand(command, out);
} else {
String cmd = tok.nextToken();
if ("quit".equalsIgnoreCase(cmd)) {
_mgr.processQuit(out);
} else if ("lookup".equalsIgnoreCase(cmd)) {
if (tok.hasMoreTokens())
_mgr.processLookup(tok.nextToken(), out);
else
_mgr.error("Usage: lookup <hostname>", out);
} else if ("testdestination".equalsIgnoreCase(cmd)) {
if (tok.hasMoreTokens())
_mgr.processTestDestination(tok.nextToken(), out);
else
_mgr.error("Usage: testdestination <publicDestination>", out);
} else if ("convertprivate".equalsIgnoreCase(cmd)) {
if (tok.hasMoreTokens())
_mgr.processConvertPrivate(tok.nextToken(), out);
else
_mgr.error("Usage: convertprivate <privateData>", out);
} else if ("close".equalsIgnoreCase(cmd)) {
if (tok.hasMoreTokens()) {
String closeArg;
if ((closeArg = tok.nextToken()).equals("forced")) {
if (tok.hasMoreTokens()) {
_mgr.processClose(tok.nextToken(), true, out);
} else {
_mgr.error("Usage: close [forced] <jobnumber>|all", out);
}
} else {
_mgr.processClose(closeArg, false, out);
}
} else {
_mgr.error("Usage: close [forced] <jobnumber>|all", out);
}
} else if ("genkey".equalsIgnoreCase(cmd)) {
_mgr.processGenKey(out);
} else if ("list".equalsIgnoreCase(cmd)) {
_mgr.processList(out);
} else if ("listen_on".equalsIgnoreCase(cmd)) {
if (tok.hasMoreTokens()) {
_mgr.processListenOn(tok.nextToken(), out);
} else {
_mgr.error("Usage: listen_on <ip>", out);
}
} else if ("openclient".equalsIgnoreCase(cmd)) {
int listenPort = 0;
String peer = null;
String sharedClient = null;
int numTokens = tok.countTokens();
if (numTokens < 2 || numTokens > 3) {
_mgr.error("Usage: openclient <listenPort> <peer> <sharedClient>", out);
return;
}
try {
listenPort = Integer.parseInt(tok.nextToken());
peer = tok.nextToken();
if (tok.hasMoreTokens())
sharedClient = tok.nextToken();
else
sharedClient = "true";
_mgr.processOpenClient(listenPort, peer, sharedClient, out);
} catch (NumberFormatException nfe) {
_mgr.error("Bad listen port", out);
return;
}
} else if ("openhttpclient".equalsIgnoreCase(cmd)) {
int listenPort = 0;
String proxy = "squid.i2p";
String sharedClient = "true";
int numTokens = tok.countTokens();
if (numTokens < 1 || numTokens > 3) {
_mgr.error("Usage: openhttpclient <listenPort> [<sharedClient>] [<proxy>]", out);
return;
}
try {
listenPort = Integer.parseInt(tok.nextToken());
if (tok.hasMoreTokens()) {
String val = tok.nextToken();
if (tok.hasMoreTokens()) {
sharedClient = val;
proxy = tok.nextToken();
} else {
if ( ("true".equals(val)) || ("false".equals(val)) ) {
sharedClient = val;
} else {
proxy = val;
}
}
}
_mgr.processOpenHTTPClient(listenPort, sharedClient, proxy, out);
} catch (NumberFormatException nfe) {
_mgr.error("Bad listen port", out);
return;
}
} else if ("opensockstunnel".equalsIgnoreCase(cmd)) {
int listenPort = 0;
if (!tok.hasMoreTokens()) {
_mgr.error("Usage: opensockstunnel <listenPort>", out);
return;
}
try {
String portStr = tok.nextToken();
listenPort = Integer.parseInt(portStr);
} catch (NumberFormatException nfe) {
_mgr.error("Bad listen port", out);
return;
}
if (tok.hasMoreTokens()) {
_mgr.error("Usage: opensockstunnel <listenport>", out);
return;
}
_mgr.processOpenSOCKSTunnel(listenPort, out);
} else if ("openserver".equalsIgnoreCase(cmd)) {
int listenPort = 0;
String serverHost = null;
String serverKeys = null;
if (!tok.hasMoreTokens()) {
_mgr.error("Usage: openserver <serverHost> <serverPort> <serverKeys>", out);
return;
}
serverHost = tok.nextToken();
if (!tok.hasMoreTokens()) {
_mgr.error("Usage: openserver <serverHost> <serverPort> <serverKeys>", out);
return;
}
try {
String portStr = tok.nextToken();
listenPort = Integer.parseInt(portStr);
} catch (NumberFormatException nfe) {
_mgr.error("Bad listen port", out);
return;
}
if (!tok.hasMoreTokens()) {
_mgr.error("Usage: openserver <serverHost> <serverPort> <serverKeys>", out);
return;
}
serverKeys = tok.nextToken();
_mgr.processOpenServer(serverHost, listenPort, serverKeys, out);
} else {
_mgr.unknownCommand(command, out);
}
}
}
}

View File

@@ -93,7 +93,7 @@ if (curTunnel >= 0) {
</select>
&nbsp;&nbsp;
<b>others:</b>
<input type="text" name="reachablyByOther" size="20" />
<input type="text" name="reachableByOther" size="20" />
<% } else if ("0.0.0.0".equals(clientInterface)) { %>
<option value="127.0.0.1">Locally (127.0.0.1)</option>
<option value="0.0.0.0" selected="true">Everyone (0.0.0.0)</option>
@@ -102,7 +102,7 @@ if (curTunnel >= 0) {
</select>
&nbsp;&nbsp;
<b>others:</b>
<input type="text" name="reachablyByOther" size="20" />
<input type="text" name="reachableByOther" size="20" />
<% } else { %>
<option value="127.0.0.1">Locally (127.0.0.1)</option>
<option value="0.0.0.0">Everyone (0.0.0.0)</option>

View File

@@ -1,4 +1,4 @@
<?xml version="1.0" encoding="UTF-8"?>
<?xml version="1.0" encoding="ISO-8859-1"?>
<!DOCTYPE web-app
PUBLIC "-//Sun Microsystems, Inc.//DTD Web Application 2.2//EN"
"http://java.sun.com/j2ee/dtds/web-app_2.2.dtd">
@@ -14,4 +14,4 @@
<welcome-file>index.html</welcome-file>
<welcome-file>index.jsp</welcome-file>
</welcome-file-list>
</web-app>
</web-app>

View File

@@ -15,6 +15,7 @@ import net.i2p.I2PAppContext;
import net.i2p.I2PException;
import net.i2p.data.Destination;
import net.i2p.data.DataFormatException;
import net.i2p.util.I2PThread;
import net.i2p.util.Log;
/**
@@ -74,61 +75,67 @@ public class StreamSinkClient {
} finally {
if (fis != null) try { fis.close(); } catch (IOException ioe) {}
}
System.out.println("Send " + _sendSize + "KB to " + peer.calculateHash().toBase64());
try {
I2PSocket sock = mgr.connect(peer);
byte buf[] = new byte[32*1024];
Random rand = new Random();
OutputStream out = sock.getOutputStream();
long beforeSending = System.currentTimeMillis();
for (int i = 0; i < _sendSize; i+= 32) {
rand.nextBytes(buf);
out.write(buf);
if (_log.shouldLog(Log.DEBUG))
_log.debug("Send " + _sendSize + "KB to " + peer.calculateHash().toBase64());
while (true) {
try {
I2PSocket sock = mgr.connect(peer);
byte buf[] = new byte[Math.min(32*1024, _sendSize*1024)];
Random rand = new Random();
OutputStream out = sock.getOutputStream();
long beforeSending = System.currentTimeMillis();
for (int i = 0; (_sendSize < 0) || (i < _sendSize); i+= buf.length/1024) {
rand.nextBytes(buf);
out.write(buf);
if (_log.shouldLog(Log.DEBUG))
_log.debug("Wrote " + ((1+i*buf.length)/1024) + "/" + _sendSize + "KB");
if (_writeDelay > 0) {
try { Thread.sleep(_writeDelay); } catch (InterruptedException ie) {}
}
}
sock.close();
long afterSending = System.currentTimeMillis();
if (_log.shouldLog(Log.DEBUG))
_log.debug("Wrote " + (i+32) + "/" + _sendSize + "KB");
if (_writeDelay > 0) {
try { Thread.sleep(_writeDelay); } catch (InterruptedException ie) {}
}
}
sock.close();
long afterSending = System.currentTimeMillis();
System.out.println("Sent " + _sendSize + "KB in " + (afterSending-beforeSending) + "ms");
} catch (InterruptedIOException iie) {
_log.error("Timeout connecting to the peer", iie);
return;
} catch (NoRouteToHostException nrthe) {
_log.error("Unable to connect to the peer", nrthe);
return;
} catch (ConnectException ce) {
_log.error("Connection already dropped", ce);
return;
} catch (I2PException ie) {
_log.error("Error connecting to the peer", ie);
return;
} catch (IOException ioe) {
_log.error("IO error sending", ioe);
return;
_log.debug("Sent " + _sendSize + "KB in " + (afterSending-beforeSending) + "ms");
} catch (InterruptedIOException iie) {
_log.error("Timeout connecting to the peer", iie);
//return;
} catch (NoRouteToHostException nrthe) {
_log.error("Unable to connect to the peer", nrthe);
//return;
} catch (ConnectException ce) {
_log.error("Connection already dropped", ce);
//return;
} catch (I2PException ie) {
_log.error("Error connecting to the peer", ie);
return;
} catch (IOException ioe) {
_log.error("IO error sending", ioe);
return;
}
}
}
/**
* Fire up the client. <code>Usage: StreamSinkClient [i2cpHost i2cpPort] sendSizeKB writeDelayMs serverDestFile</code> <br />
* Fire up the client. <code>Usage: StreamSinkClient [i2cpHost i2cpPort] sendSizeKB writeDelayMs serverDestFile [concurrentSends]</code> <br />
* <ul>
* <li><b>sendSizeKB</b>: how many KB to send</li>
* <li><b>sendSizeKB</b>: how many KB to send, or -1 for unlimited</li>
* <li><b>writeDelayMs</b>: how long to wait between each .write (0 for no delay)</li>
* <li><b>serverDestFile</b>: file containing the StreamSinkServer's binary Destination</li>
* <li><b>concurrentSends</b>: how many concurrent threads should send to the server at once</li>
* </ul>
*/
public static void main(String args[]) {
StreamSinkClient client = null;
int sendSizeKB = -1;
int writeDelayMs = -1;
int concurrent = 1;
switch (args.length) {
case 3:
case 3: // fall through
case 4:
try {
sendSizeKB = Integer.parseInt(args[0]);
} catch (NumberFormatException nfe) {
@@ -141,9 +148,13 @@ public class StreamSinkClient {
System.err.println("Write delay ms invalid [" + args[1] + "]");
return;
}
if (args.length == 4) {
try { concurrent = Integer.parseInt(args[3]); } catch (NumberFormatException nfe) {}
}
client = new StreamSinkClient(sendSizeKB, writeDelayMs, args[2]);
break;
case 5:
case 5: // fall through
case 6:
try {
int port = Integer.parseInt(args[1]);
sendSizeKB = Integer.parseInt(args[2]);
@@ -152,11 +163,26 @@ public class StreamSinkClient {
} catch (NumberFormatException nfe) {
System.err.println("arg error");
}
if (args.length == 6) {
try { concurrent = Integer.parseInt(args[5]); } catch (NumberFormatException nfe) {}
}
break;
default:
System.out.println("Usage: StreamSinkClient [i2cpHost i2cpPort] sendSizeKB writeDelayMs serverDestFile");
System.out.println("Usage: StreamSinkClient [i2cpHost i2cpPort] sendSizeKB writeDelayMs serverDestFile [concurrentSends]");
}
if (client != null) {
for (int i = 0; i < concurrent; i++)
new I2PThread(new Runner(client), "Client " + i).start();
}
}
private static class Runner implements Runnable {
private StreamSinkClient _client;
public Runner(StreamSinkClient client) {
_client = client;
}
public void run() {
_client.runClient();
}
if (client != null)
client.runClient();
}
}
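To make the rewritten send loop concrete: the counter i is in kilobytes and advances by buf.length/1024 per write, while a negative _sendSize keeps the loop running indefinitely. A quick arithmetic check of the bounded case, with illustrative numbers:
int sendSizeKB = 128;              // illustrative send size
byte[] buf = new byte[32 * 1024];  // same 32 KB buffer as above
int writes = 0;
for (int i = 0; i < sendSizeKB; i += buf.length / 1024)
    writes++;                      // each iteration writes buf.length bytes
System.out.println(writes);        // prints 4: 4 writes * 32 KB = 128 KB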

View File

@@ -6,6 +6,8 @@ import java.io.IOException;
import java.io.InputStream;
import java.net.ConnectException;
import java.util.ArrayList;
import java.util.List;
import java.util.Properties;
import net.i2p.I2PAppContext;
@@ -26,6 +28,7 @@ public class StreamSinkServer {
private String _destFile;
private String _i2cpHost;
private int _i2cpPort;
private int _handlers;
/**
* Create but do not start the streaming server.
@@ -34,13 +37,14 @@ public class StreamSinkServer {
* @param ourDestFile filename to write our binary destination to
*/
public StreamSinkServer(String sinkDir, String ourDestFile) {
this(sinkDir, ourDestFile, null, -1);
this(sinkDir, ourDestFile, null, -1, 3);
}
public StreamSinkServer(String sinkDir, String ourDestFile, String i2cpHost, int i2cpPort) {
public StreamSinkServer(String sinkDir, String ourDestFile, String i2cpHost, int i2cpPort, int handlers) {
_sinkDir = sinkDir;
_destFile = ourDestFile;
_i2cpHost = i2cpHost;
_i2cpPort = i2cpPort;
_handlers = handlers;
_log = I2PAppContext.getGlobalContext().logManager().getLog(StreamSinkServer.class);
}
@@ -56,7 +60,8 @@ public class StreamSinkServer {
else
mgr = I2PSocketManagerFactory.createManager();
Destination dest = mgr.getSession().getMyDestination();
System.out.println("Listening for connections on: " + dest.calculateHash().toBase64());
if (_log.shouldLog(Log.INFO))
_log.info("Listening for connections on: " + dest.calculateHash().toBase64());
FileOutputStream fos = null;
try {
fos = new FileOutputStream(_destFile);
@@ -72,24 +77,16 @@ public class StreamSinkServer {
}
I2PServerSocket sock = mgr.getServerSocket();
while (true) {
try {
I2PSocket curSock = sock.accept();
handle(curSock);
} catch (I2PException ie) {
_log.error("Error accepting connection", ie);
return;
} catch (ConnectException ce) {
_log.error("Connection already dropped", ce);
return;
}
}
startup(sock);
}
private void handle(I2PSocket socket) {
I2PThread t = new I2PThread(new ClientRunner(socket));
t.setName("Handle " + socket.getPeerDestination().calculateHash().toBase64().substring(0,4));
t.start();
public void startup(I2PServerSocket sock) {
for (int i = 0; i < _handlers; i++) {
I2PThread t = new I2PThread(new ClientRunner(sock));
t.setName("Handler " + i);
t.setDaemon(false);
t.start();
}
}
/**
@@ -97,27 +94,44 @@ public class StreamSinkServer {
*
*/
private class ClientRunner implements Runnable {
private I2PSocket _sock;
private FileOutputStream _fos;
public ClientRunner(I2PSocket socket) {
_sock = socket;
private I2PServerSocket _socket;
public ClientRunner(I2PServerSocket socket) {
_socket = socket;
}
public void run() {
while (true) {
try {
I2PSocket socket = _socket.accept();
if (socket != null)
handle(socket);
} catch (I2PException ie) {
_log.error("Error accepting connection", ie);
return;
} catch (ConnectException ce) {
_log.error("Connection already dropped", ce);
return;
}
}
}
private void handle(I2PSocket sock) {
FileOutputStream fos = null;
try {
File sink = new File(_sinkDir);
if (!sink.exists())
sink.mkdirs();
File cur = File.createTempFile("clientSink", ".dat", sink);
_fos = new FileOutputStream(cur);
System.out.println("Writing to " + cur.getAbsolutePath());
fos = new FileOutputStream(cur);
if (_log.shouldLog(Log.DEBUG))
_log.debug("Writing to " + cur.getAbsolutePath());
} catch (IOException ioe) {
_log.error("Error creating sink", ioe);
_fos = null;
return;
}
}
public void run() {
if (_fos == null) return;
long start = System.currentTimeMillis();
try {
InputStream in = _sock.getInputStream();
InputStream in = sock.getInputStream();
byte buf[] = new byte[4096];
long written = 0;
int read = 0;
@@ -125,47 +139,55 @@ public class StreamSinkServer {
//_fos.write(buf, 0, read);
written += read;
if (_log.shouldLog(Log.DEBUG))
_log.debug("read and wrote " + read);
_log.debug("read and wrote " + read + " (" + written + ")");
}
_fos.write(("written: [" + written + "]\n").getBytes());
fos.write(("written: [" + written + "]\n").getBytes());
long lifetime = System.currentTimeMillis() - start;
_log.error("Got EOF from client socket [written=" + written + " lifetime=" + lifetime + "]");
_log.info("Got EOF from client socket [written=" + written + " lifetime=" + lifetime + "]");
} catch (IOException ioe) {
_log.error("Error writing the sink", ioe);
} finally {
if (_fos != null) try { _fos.close(); } catch (IOException ioe) {}
if (_sock != null) try { _sock.close(); } catch (IOException ioe) {}
_log.error("Client socket closed");
if (fos != null) try { fos.close(); } catch (IOException ioe) {}
if (sock != null) try { sock.close(); } catch (IOException ioe) {}
_log.debug("Client socket closed");
}
}
}
/**
* Fire up the streaming server. <code>Usage: StreamSinkServer [i2cpHost i2cpPort] sinkDir ourDestFile</code><br />
* Fire up the streaming server. <code>Usage: StreamSinkServer [i2cpHost i2cpPort] sinkDir ourDestFile [numHandlers]</code><br />
* <ul>
* <li><b>sinkDir</b>: Directory to store received files in</li>
* <li><b>ourDestFile</b>: filename to write our binary destination to</li>
* <li><b>numHandlers</b>: how many concurrent connections to handle</li>
* </ul>
*/
public static void main(String args[]) {
StreamSinkServer server = null;
switch (args.length) {
case 0:
server = new StreamSinkServer("dataDir", "server.key", "localhost", 7654);
server = new StreamSinkServer("dataDir", "server.key", "localhost", 7654, 3);
break;
case 2:
server = new StreamSinkServer(args[0], args[1]);
break;
case 4:
case 5:
int handlers = 3;
if (args.length == 5) {
try {
handlers = Integer.parseInt(args[4]);
} catch (NumberFormatException nfe) {}
}
try {
int port = Integer.parseInt(args[1]);
server = new StreamSinkServer(args[2], args[3], args[0], port);
server = new StreamSinkServer(args[2], args[3], args[0], port, handlers);
} catch (NumberFormatException nfe) {
System.out.println("Usage: StreamSinkServer [i2cpHost i2cpPort] sinkDir ourDestFile");
System.out.println("Usage: StreamSinkServer [i2cpHost i2cpPort] sinkDir ourDestFile [handlers]");
}
break;
default:
System.out.println("Usage: StreamSinkServer [i2cpHost i2cpPort] sinkDir ourDestFile");
System.out.println("Usage: StreamSinkServer [i2cpHost i2cpPort] sinkDir ourDestFile [handlers]");
}
if (server != null)
server.runServer();

View File

@@ -17,7 +17,7 @@
<property location="doc/q/api" name="javadoc.dir"/>
<property name="project.name" value="${ant.project.name}"/>
<property location="${project.name}.jar" name="jar"/>
<property location="${project.name}.war" name="war"/>
<property location="q.war" name="war"/>
</target>
<target name="builddep">
@@ -59,9 +59,9 @@
<!-- To make a standalone app, insert into <jar>: -->
<!-- <manifest><attribute name="Main-Class" value="com.foo.Main"/></manifest> -->
<war compress="true" jarfile="${war}" webxml="web.xml">
<!-- <fileset dir="${classes.dir}" includes="**/QConsole.class"/> -->
<classes file="build/net/i2p/aum/q/QConsole.class"/>
<classes file="build/HTML/**"/>
<!-- <fileset file="build/net/i2p/aum/q/QConsole.class"/> -->
<classes dir="build" includes="**/QConsole.class"/>
<classes dir="build" includes="**/HTML/**"/>
<!-- <fileset includes="**/HTML/*.class"/> -->
<lib file="xmlrpc.jar"/>
</war>

View File

@@ -157,6 +157,8 @@ public abstract class QNode extends Thread
*/
public String nodeType = "(base)";
public boolean isRunning;
// ----------------------------------------------------------
// CONSTRUCTORS
@@ -580,6 +582,13 @@ public abstract class QNode extends Thread
System.out.println("scheduleStartupJobs: c<p="+updateCatalogFromPeers+", isClient="+isClient);
}
public void scheduleShutdown()
{
Hashtable job = new Hashtable();
job.put("cmd", "shutdown");
runAfter(1000, job, "shutdown");
}
public void schedulePeerUploadJob(QDataItem item)
{
String uri = (String)item.get("uri");
@@ -790,6 +799,8 @@ public abstract class QNode extends Thread
{
log.info("Starting background tasks");
isRunning = true;
// mark our start time
nodeStartTime = new Date();
@@ -833,7 +844,7 @@ public abstract class QNode extends Thread
// fetch items from the job queue, and launch
// threads to execute them
while (true)
while (isRunning)
{
// get a thread slot from the thread pool
try {

View File

@@ -341,13 +341,15 @@ public class QServerMethods {
//System.out.println("shutdown: our privkey="+node.privKeyStr);
//System.out.println("shutdown: nodePrivKey="+nodePrivKey);
if (nodePrivKey.equals(node.privKeyStr)) {
res.put("status", "ok");
//node.scheduleShutdown();
// get a runtime
System.out.println("Node at "+node.dataDir+" shutting down");
//System.out.println("Node at "+node.dataDir+" shutting down");
Runtime r = Runtime.getRuntime();
// and terminate the vm
r.exit(0);
//r.halt(0);
//r.exit(0);
r.halt(0);
}
else {
res.put("status", "error");
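The change above swaps Runtime.exit(0) for Runtime.halt(0) in the remote shutdown handler. The practical difference: exit() runs any registered shutdown hooks (and can therefore block before the VM goes away), while halt() terminates the VM immediately without running them. A tiny illustration with a hypothetical hook:
// Illustrative only: exit(0) would run this hook before terminating; halt(0) never does.
Runtime.getRuntime().addShutdownHook(new Thread(new Runnable() {
    public void run() { System.out.println("shutdown hook ran"); }
}));
Runtime.getRuntime().halt(0); // immediate VM termination, the hook above never prints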

View File

@@ -62,6 +62,9 @@ class QWorkerThread extends Thread {
else if (cmd.equals("test")) {
doTest();
}
else if (cmd.equals("shutdown")) {
doShutdown();
}
else {
node.log.error("workerthread.run: unrecognised command '"+cmd+"'");
System.out.println("workerthread.run: unrecognised command '"+cmd+"'");
@@ -90,6 +93,21 @@ class QWorkerThread extends Thread {
System.out.println("TESTJOB: msg='"+msg+"'");
}
public void doShutdown() throws Exception {
try {
new File(node.jobsDir + node.sep + jobTime).delete();
new File(node.jobsDir + node.sep + jobTime + ".desc").delete();
} catch (Exception e) {
e.printStackTrace();
}
SimpleFile f = new SimpleFile("/tmp/eeee", "rws");
f.write("xxx");
node.isRunning = false;
Runtime.getRuntime().halt(0);
}
public void doLocalPutItem() throws Exception {
Hashtable metadata = (Hashtable)job.get("metadata");
String path = (String)job.get("localDataFilePath");

View File

@@ -27,8 +27,10 @@ public class ConfigNetHandler extends FormHandler {
private boolean _guessRequested;
private boolean _reseedRequested;
private boolean _saveRequested;
private boolean _recheckReachabilityRequested;
private boolean _timeSyncEnabled;
private String _port;
private String _tcpPort;
private String _udpPort;
private String _inboundRate;
private String _inboundBurst;
private String _outboundRate;
@@ -43,6 +45,8 @@ public class ConfigNetHandler extends FormHandler {
reseed();
} else if (_saveRequested) {
saveChanges();
} else if (_recheckReachabilityRequested) {
recheckReachability();
} else {
// noop
}
@@ -52,12 +56,16 @@ public class ConfigNetHandler extends FormHandler {
public void setReseed(String moo) { _reseedRequested = true; }
public void setSave(String moo) { _saveRequested = true; }
public void setEnabletimesync(String moo) { _timeSyncEnabled = true; }
public void setRecheckReachability(String moo) { _recheckReachabilityRequested = true; }
public void setHostname(String hostname) {
_hostname = (hostname != null ? hostname.trim() : null);
}
public void setPort(String port) {
_port = (port != null ? port.trim() : null);
public void setTcpPort(String port) {
_tcpPort = (port != null ? port.trim() : null);
}
public void setUdpPort(String port) {
_udpPort = (port != null ? port.trim() : null);
}
public void setInboundrate(String rate) {
_inboundRate = (rate != null ? rate.trim() : null);
@@ -191,6 +199,11 @@ public class ConfigNetHandler extends FormHandler {
fos.close();
}
private void recheckReachability() {
_context.commSystem().recheckReachability();
addFormNotice("Rechecking router reachability...");
}
/**
* The user made changes to the network config and wants to save them, so
* lets go ahead and do so.
@@ -207,14 +220,25 @@ public class ConfigNetHandler extends FormHandler {
restartRequired = true;
}
}
if ( (_port != null) && (_port.length() > 0) ) {
if ( (_tcpPort != null) && (_tcpPort.length() > 0) ) {
String oldPort = _context.router().getConfigSetting(ConfigNetHelper.PROP_I2NP_TCP_PORT);
if ( (oldPort == null) && (_port.equals("8887")) ) {
if ( (oldPort == null) && (_tcpPort.equals("8887")) ) {
// still on default.. noop
} else if ( (oldPort == null) || (!oldPort.equalsIgnoreCase(_port)) ) {
} else if ( (oldPort == null) || (!oldPort.equalsIgnoreCase(_tcpPort)) ) {
// its not the default OR it has changed
_context.router().setConfigSetting(ConfigNetHelper.PROP_I2NP_TCP_PORT, _port);
addFormNotice("Updating TCP port from " + oldPort + " to " + _port);
_context.router().setConfigSetting(ConfigNetHelper.PROP_I2NP_TCP_PORT, _tcpPort);
addFormNotice("Updating TCP port from " + oldPort + " to " + _tcpPort);
restartRequired = true;
}
}
if ( (_udpPort != null) && (_udpPort.length() > 0) ) {
String oldPort = _context.router().getConfigSetting(ConfigNetHelper.PROP_I2NP_UDP_PORT);
if ( (oldPort == null) && (_udpPort.equals("8887")) ) {
// still on default.. noop
} else if ( (oldPort == null) || (!oldPort.equalsIgnoreCase(_udpPort)) ) {
// its not the default OR it has changed
_context.router().setConfigSetting(ConfigNetHelper.PROP_I2NP_UDP_PORT, _udpPort);
addFormNotice("Updating UDP port from " + oldPort + " to " + _udpPort);
restartRequired = true;
}
}

View File

@@ -24,11 +24,13 @@ public class ConfigNetHelper {
/** copied from various private TCP components */
public final static String PROP_I2NP_TCP_HOSTNAME = "i2np.tcp.hostname";
public final static String PROP_I2NP_TCP_PORT = "i2np.tcp.port";
public final static String PROP_I2NP_UDP_PORT = "i2np.udp.port";
public final static String PROP_I2NP_INTERNAL_UDP_PORT = "i2np.udp.internalPort";
public String getHostname() {
return _context.getProperty(PROP_I2NP_TCP_HOSTNAME);
}
public String getPort() {
public String getTcpPort() {
int port = 8887;
String val = _context.getProperty(PROP_I2NP_TCP_PORT);
if (val != null) {
@@ -41,6 +43,21 @@ public class ConfigNetHelper {
return "" + port;
}
public String getUdpPort() {
int port = 8887;
String val = _context.getProperty(PROP_I2NP_UDP_PORT);
if (val == null)
val = _context.getProperty(PROP_I2NP_INTERNAL_UDP_PORT);
if (val != null) {
try {
port = Integer.parseInt(val);
} catch (NumberFormatException nfe) {
// ignore, use default from above
}
}
return "" + port;
}
public String getEnableTimeSyncChecked() {
String disabled = _context.getProperty(Timestamper.PROP_DISABLED, "false");
if ( (disabled != null) && ("true".equalsIgnoreCase(disabled)) )

View File

@@ -0,0 +1,96 @@
package net.i2p.router.web;
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import java.util.StringTokenizer;
import net.i2p.util.Log;
import net.i2p.stat.StatManager;
/**
* Handler to deal with form submissions from the stats config form and act
* upon the values.
*
*/
public class ConfigStatsHandler extends FormHandler {
private String _filename;
private List _stats;
private boolean _explicitFilter;
private String _explicitFilterValue;
public ConfigStatsHandler() {
super();
_stats = new ArrayList();
_explicitFilter = false;
}
protected void processForm() {
saveChanges();
}
public void setFilename(String filename) {
_filename = (filename != null ? filename.trim() : null);
}
public void setStatList(String stats[]) {
if (stats != null) {
for (int i = 0; i < stats.length; i++) {
String cur = stats[i].trim();
if (_log.shouldLog(Log.DEBUG))
_log.debug("Stat: [" + cur + "]");
if ( (cur.length() > 0) && (!_stats.contains(cur)) )
_stats.add(cur);
}
}
if (_log.shouldLog(Log.DEBUG))
_log.debug("Updated stats: " + _stats);
}
public void setExplicitFilter(String foo) { _explicitFilter = true; }
public void setExplicitFilterValue(String filter) { _explicitFilterValue = filter; }
/**
* The user made changes to the config and wants to save them, so
* lets go ahead and do so.
*
*/
private void saveChanges() {
if (_filename == null)
_filename = StatManager.DEFAULT_STAT_FILE;
_context.router().setConfigSetting(StatManager.PROP_STAT_FILE, _filename);
if (_explicitFilter) {
_stats.clear();
if (_explicitFilterValue.indexOf(',') != -1) {
StringTokenizer tok = new StringTokenizer(_explicitFilterValue, ",");
while (tok.hasMoreTokens()) {
String cur = tok.nextToken().trim();
if ( (cur.length() > 0) && (!_stats.contains(cur)) )
_stats.add(cur);
}
} else {
String stat = _explicitFilterValue.trim();
if ( (stat.length() > 0) && (!_stats.contains(stat)) )
_stats.add(stat);
}
}
StringBuffer stats = new StringBuffer();
for (int i = 0; i < _stats.size(); i++) {
stats.append((String)_stats.get(i));
if (i + 1 < _stats.size())
stats.append(',');
}
_context.router().setConfigSetting(StatManager.PROP_STAT_FILTER, stats.toString());
boolean ok = _context.router().saveConfig();
if (ok)
addFormNotice("Stat filter and location updated successfully to: " + stats.toString());
else
addFormError("Failed to update the stat filter and location");
}
}
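The filter saved above is simply a comma-separated list of stat names, with "*" meaning every stat (the ConfigStatsHelper added in the next file splits it on commas and checks for "*" or an exact name match). A small self-contained sketch of that matching convention; the second and third stat names are illustrative, the transport.* names appear elsewhere in this changeset.
import java.util.HashSet;
import java.util.Set;
import java.util.StringTokenizer;
public class StatFilterSketch {
    // mirrors the filter parsing/matching convention used by ConfigStatsHelper
    static boolean isLogged(String filter, String statName) {
        Set filters = new HashSet();
        StringTokenizer tok = new StringTokenizer(filter, ",");
        while (tok.hasMoreTokens())
            filters.add(tok.nextToken().trim());
        return filters.contains("*") || filters.contains(statName);
    }
    public static void main(String[] args) {
        String filter = "transport.sendMessageSize,transport.receiveMessageSize";
        System.out.println(isLogged(filter, "transport.sendMessageSize")); // true
        System.out.println(isLogged(filter, "tunnel.someOtherStat"));      // false (illustrative name)
        System.out.println(isLogged("*", "tunnel.someOtherStat"));         // true
    }
}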

View File

@@ -0,0 +1,125 @@
package net.i2p.router.web;
import java.util.ArrayList;
import java.util.HashSet;
import java.util.Iterator;
import java.util.List;
import java.util.Map;
import java.util.Set;
import java.util.StringTokenizer;
import net.i2p.stat.RateStat;
import net.i2p.stat.FrequencyStat;
import net.i2p.router.RouterContext;
import net.i2p.util.Log;
public class ConfigStatsHelper {
private RouterContext _context;
private Log _log;
private String _filter;
private Set _filters;
/** list of names of stats which are remaining, ordered by nested groups */
private List _stats;
private String _currentStatName;
private String _currentStatDescription;
private String _currentGroup;
/** true if the current stat is the first in the group */
private boolean _currentIsFirstInGroup;
/** true if the stat is being logged */
private boolean _currentIsLogged;
/**
* Configure this bean to query a particular router context
*
* @param contextId beginning few characters of the routerHash, or null to pick
* the first one we come across.
*/
public void setContextId(String contextId) {
try {
_context = ContextHelper.getContext(contextId);
_log = _context.logManager().getLog(ConfigStatsHelper.class);
} catch (Throwable t) {
t.printStackTrace();
}
_stats = new ArrayList();
Map groups = _context.statManager().getStatsByGroup();
for (Iterator iter = groups.values().iterator(); iter.hasNext(); ) {
Set stats = (Set)iter.next();
for (Iterator statIter = stats.iterator(); statIter.hasNext(); )
_stats.add(statIter.next());
}
_filter = _context.statManager().getStatFilter();
if (_filter == null)
_filter = "";
_filters = new HashSet();
StringTokenizer tok = new StringTokenizer(_filter, ",");
while (tok.hasMoreTokens())
_filters.add(tok.nextToken().trim());
}
public ConfigStatsHelper() {}
public String getFilename() { return _context.statManager().getStatFile(); }
/**
* move the cursor to the next known stat, returning true if a valid
* stat is available.
*
* @return true if a valid stat is available, otherwise false
*/
public boolean hasMoreStats() {
if (_stats.size() <= 0)
return false;
_currentStatName = (String)_stats.remove(0);
RateStat rs = _context.statManager().getRate(_currentStatName);
if (rs != null) {
_currentStatDescription = rs.getDescription();
if (_currentGroup == null)
_currentIsFirstInGroup = true;
else if (!rs.getGroupName().equals(_currentGroup))
_currentIsFirstInGroup = true;
else
_currentIsFirstInGroup = false;
_currentGroup = rs.getGroupName();
} else {
FrequencyStat fs = _context.statManager().getFrequency(_currentStatName);
if (fs != null) {
_currentStatDescription = fs.getDescription();
if (_currentGroup == null)
_currentIsFirstInGroup = true;
else if (!fs.getGroupName().equals(_currentGroup))
_currentIsFirstInGroup = true;
else
_currentIsFirstInGroup = false;
_currentGroup = fs.getGroupName();
} else {
if (_log.shouldLog(Log.ERROR))
_log.error("Stat does not exist?! [" + _currentStatName + "]");
return false;
}
}
if (_filters.contains("*") || _filters.contains(_currentStatName))
_currentIsLogged = true;
else
_currentIsLogged = false;
return true;
}
/** Is the current stat the first in the group? */
public boolean groupRequired() {
if (_currentIsFirstInGroup) {
_currentIsFirstInGroup = false;
return true;
} else {
return false;
}
}
/** What group is the current stat in */
public String getCurrentGroupName() { return _currentGroup; }
public String getCurrentStatName() { return _currentStatName; }
public String getCurrentStatDescription() { return _currentStatDescription; }
public boolean getCurrentIsLogged() { return _currentIsLogged; }
public String getExplicitFilter() { return _filter; }
}

View File

@@ -1,6 +1,10 @@
package net.i2p.router.web;
import net.i2p.I2PAppContext;
import net.i2p.data.DataHelper;
import net.i2p.router.Router;
import net.i2p.router.web.ConfigServiceHandler.UpdateWrapperManagerTask;
import net.i2p.util.Log;
/**
*
@@ -31,6 +35,15 @@ public class ConfigUpdateHandler extends FormHandler {
public static final String DEFAULT_PROXY_PORT = "4444";
protected void processForm() {
if ("Check for update now".equals(_action)) {
NewsFetcher fetcher = NewsFetcher.getInstance(I2PAppContext.getGlobalContext());
fetcher.fetchNews();
if (fetcher.updateAvailable())
addFormNotice("Update available, click link on left");
else
addFormNotice("No update available");
}
if ( (_newsURL != null) && (_newsURL.length() > 0) ) {
String oldURL = _context.router().getConfigSetting(PROP_NEWS_URL);
if ( (oldURL == null) || (!_newsURL.equals(oldURL)) ) {
@@ -38,6 +51,7 @@ public class ConfigUpdateHandler extends FormHandler {
addFormNotice("Updating news URL to " + _newsURL);
}
}
if ( (_updateURL != null) && (_updateURL.length() > 0) ) {
String oldURL = _context.router().getConfigSetting(PROP_UPDATE_URL);
if ( (oldURL == null) || (!_updateURL.equals(oldURL)) ) {
@@ -56,7 +70,7 @@ public class ConfigUpdateHandler extends FormHandler {
if ( (_proxyPort != null) && (_proxyPort.length() > 0) ) {
String oldPort = _context.router().getConfigSetting(PROP_PROXY_PORT);
if ( (oldPort == null) || (!_proxyHost.equals(oldPort)) ) {
if ( (oldPort == null) || (!_proxyPort.equals(oldPort)) ) {
_context.router().setConfigSetting(PROP_PROXY_PORT, _proxyPort);
addFormNotice("Updating proxy port to " + _proxyPort);
}

View File

@@ -4,6 +4,7 @@ import java.util.List;
import java.util.ArrayList;
import net.i2p.router.RouterContext;
import net.i2p.util.Log;
/**
* Simple form handler base class - does not depend on servlets or jsp,
@@ -16,8 +17,10 @@ import net.i2p.router.RouterContext;
*/
public class FormHandler {
protected RouterContext _context;
protected Log _log;
private String _nonce;
protected String _action;
protected String _passphrase;
private List _errors;
private List _notices;
private boolean _processed;
@@ -30,6 +33,7 @@ public class FormHandler {
_processed = false;
_valid = true;
_nonce = null;
_passphrase = null;
}
/**
@@ -41,6 +45,7 @@ public class FormHandler {
public void setContextId(String contextId) {
try {
_context = ContextHelper.getContext(contextId);
_log = _context.logManager().getLog(getClass());
} catch (Throwable t) {
t.printStackTrace();
}
@@ -48,6 +53,7 @@ public class FormHandler {
public void setNonce(String val) { _nonce = val; }
public void setAction(String val) { _action = val; }
public void setPassphrase(String val) { _passphrase = val; }
/**
* Override this to perform the final processing (in turn, adding formNotice
@@ -116,8 +122,14 @@ public class FormHandler {
String noncePrev = System.getProperty(getClass().getName() + ".noncePrev");
if ( ( (nonce == null) || (!_nonce.equals(nonce)) ) &&
( (noncePrev == null) || (!_nonce.equals(noncePrev)) ) ) {
addFormError("Invalid nonce, are you being spoofed?");
_valid = false;
String expected = _context.getProperty("consolePassword");
if ( (expected != null) && (expected.trim().length() > 0) && (expected.equals(_passphrase)) ) {
// ok
} else {
addFormError("Invalid nonce, are you being spoofed?");
_valid = false;
}
}
}
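The FormHandler change above lets a matching "passphrase" form parameter stand in for a valid nonce whenever the router config defines consolePassword, which is what makes scripted (command-line) form submissions against the console possible. A hedged sketch of what such a submission could look like: only the consolePassword property and the nonce/action/passphrase parameter names are taken from this diff; the console URL, port, and action value below are assumptions for illustration.
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.net.URLEncoder;
public class ConsolePostSketch {
    public static void main(String[] args) throws Exception {
        // hypothetical console form URL - substitute the real page and port for your setup
        URL url = new URL("http://127.0.0.1:7657/configservice.jsp");
        String body = "action=" + URLEncoder.encode("Restart", "UTF-8")    // hypothetical action value
                    + "&nonce=0"                                           // placeholder; bypassed by the passphrase
                    + "&passphrase=" + URLEncoder.encode("myConsolePassword", "UTF-8"); // must equal consolePassword
        HttpURLConnection conn = (HttpURLConnection) url.openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.setRequestProperty("Content-Type", "application/x-www-form-urlencoded");
        OutputStream out = conn.getOutputStream();
        out.write(body.getBytes("UTF-8"));
        out.close();
        System.out.println(conn.getResponseCode()); // 200 if the form was accepted
    }
}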

View File

@@ -91,7 +91,7 @@ public class NewsFetcher implements Runnable, EepGet.StatusListener {
return false;
}
}
private void fetchNews() {
public void fetchNews() {
String newsURL = _context.getProperty(ConfigUpdateHandler.PROP_NEWS_URL, ConfigUpdateHandler.DEFAULT_NEWS_URL);
boolean shouldProxy = Boolean.valueOf(_context.getProperty(ConfigUpdateHandler.PROP_SHOULD_PROXY, ConfigUpdateHandler.DEFAULT_SHOULD_PROXY)).booleanValue();
String proxyHost = _context.getProperty(ConfigUpdateHandler.PROP_PROXY_HOST, ConfigUpdateHandler.DEFAULT_PROXY_HOST);

View File

@@ -25,8 +25,11 @@ public class NoticeHelper {
public String getSystemNotice() {
if (_context.router().gracefulShutdownInProgress()) {
return "Graceful shutdown in "
+ DataHelper.formatDuration(_context.router().getShutdownTimeRemaining());
long remaining = _context.router().getShutdownTimeRemaining();
if (remaining > 0)
return "Graceful shutdown in " + DataHelper.formatDuration(remaining);
else
return "Graceful shutdown imminent, please be patient as state is written to disk";
} else {
return "";
}

View File

@@ -0,0 +1,37 @@
package net.i2p.router.web;
import java.io.IOException;
import java.io.Writer;
import net.i2p.router.RouterContext;
public class PeerHelper {
private RouterContext _context;
private Writer _out;
/**
* Configure this bean to query a particular router context
*
* @param contextId beginning few characters of the routerHash, or null to pick
* the first one we come across.
*/
public void setContextId(String contextId) {
try {
_context = ContextHelper.getContext(contextId);
} catch (Throwable t) {
t.printStackTrace();
}
}
public PeerHelper() {}
public void setOut(Writer out) { _out = out; }
public String getPeerSummary() {
try {
_context.commSystem().renderStatusHTML(_out);
} catch (IOException ioe) {
ioe.printStackTrace();
}
return "";
}
}

View File

@@ -12,6 +12,7 @@ import net.i2p.data.Destination;
import net.i2p.data.LeaseSet;
import net.i2p.stat.Rate;
import net.i2p.stat.RateStat;
import net.i2p.router.CommSystemFacade;
import net.i2p.router.Router;
import net.i2p.router.RouterContext;
import net.i2p.router.RouterVersion;
@@ -97,6 +98,23 @@ public class SummaryHelper {
return (_context.netDb().getKnownRouters() < 10);
}
public int getAllPeers() { return _context.netDb().getKnownRouters(); }
public String getReachability() {
int status = _context.commSystem().getReachabilityStatus();
switch (status) {
case CommSystemFacade.STATUS_OK:
return "OK";
case CommSystemFacade.STATUS_DIFFERENT:
return "ERR-SymmetricNAT";
case CommSystemFacade.STATUS_REJECT_UNSOLICITED:
return "ERR-Reject";
case CommSystemFacade.STATUS_UNKNOWN: // fallthrough
default:
return "Unknown";
}
}
/**
* Retrieve amount of used memory.
*
@@ -189,6 +207,7 @@ public class SummaryHelper {
return "0.0";
RateStat receiveRate = _context.statManager().getRate("transport.receiveMessageSize");
if (receiveRate == null) return "0.0";
Rate rate = receiveRate.getRate(60*1000);
double bytes = rate.getLastTotalValue();
double bps = (bytes*1000.0d)/(rate.getPeriod()*1024.0d);
@@ -206,6 +225,7 @@ public class SummaryHelper {
return "0.0";
RateStat receiveRate = _context.statManager().getRate("transport.sendMessageSize");
if (receiveRate == null) return "0.0";
Rate rate = receiveRate.getRate(60*1000);
double bytes = rate.getLastTotalValue();
double bps = (bytes*1000.0d)/(rate.getPeriod()*1024.0d);
@@ -224,6 +244,7 @@ public class SummaryHelper {
return "0.0";
RateStat receiveRate = _context.statManager().getRate("transport.receiveMessageSize");
if (receiveRate == null) return "0.0";
Rate rate = receiveRate.getRate(5*60*1000);
double bytes = rate.getLastTotalValue();
double bps = (bytes*1000.0d)/(rate.getPeriod()*1024.0d);
@@ -242,6 +263,7 @@ public class SummaryHelper {
return "0.0";
RateStat receiveRate = _context.statManager().getRate("transport.sendMessageSize");
if (receiveRate == null) return "0.0";
Rate rate = receiveRate.getRate(5*60*1000);
double bytes = rate.getLastTotalValue();
double bps = (bytes*1000.0d)/(rate.getPeriod()*1024.0d);
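All four rate helpers above reduce to the same conversion: the bytes accumulated during the rate period, scaled by 1000 (milliseconds to seconds) and divided by 1024 (bytes to KB) - so despite the local name bps, the result is kilobytes per second. A quick worked check with illustrative numbers:
double bytes = 307200;            // illustrative: 300 KB moved during the period
double periodMs = 60 * 1000;      // the one-minute rate used above
double kbps = (bytes * 1000.0d) / (periodMs * 1024.0d);
System.out.println(kbps);         // 5.0, i.e. 300 KB over 60 seconds = 5 KB/s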

View File

@@ -28,13 +28,15 @@
<input type="hidden" name="nonce" value="<%=System.getProperty("net.i2p.router.web.ConfigNetHandler.nonce")%>" />
<input type="hidden" name="action" value="blah" />
TCP port:
<input name="port" type="text" size="4" value="<jsp:getProperty name="nethelper" property="port" />" /> <br />
UDP port: <i><jsp:getProperty name="nethelper" property="udpPort" /></i><br />
<!-- <input name="udpPort" type="text" size="5" value="<jsp:getProperty name="nethelper" property="udpPort" />" /><br /> -->
<b>You must poke a hole in your firewall or NAT (if applicable) to receive new inbound UDP packets on
this port from arbitrary peers (this requirement will be removed in i2p 0.6.1, but is necessary now)</b><br />
TCP port: <input name="tcpPort" type="text" size="5" value="<jsp:getProperty name="nethelper" property="tcpPort" />" /> <br />
<b>You must poke a hole in your firewall or NAT (if applicable) so that you can receive inbound TCP
connections on it.</b> Nothing will work if you don't. Sorry. We know how to make it so
this restriction won't be necessary, but its later on in the
<a href="http://www.i2p.net/roadmap">roadmap</a> and we only have so many coder-hours (but if you want
to help, please <a href="http://www.i2p.net/getinvolved">get involved!</a>)
connections on it (this requirement will be removed in i2p 0.6.1, but is necessary now)</b>
<br />
<input type="submit" name="recheckReachability" value="Check network reachability..." />
<hr />
<b>Bandwidth limiter</b><br />
@@ -57,7 +59,7 @@
packets on port 123 to one of the pool.ntp.org machines (or some other SNTP server).</i>
<hr />
<input type="submit" name="save" value="Save changes" /> <input type="reset" value="Cancel" /><br />
<i>Changing the TCP port will force a 'soft restart' - dropping your connections and clients as
<i>Changing the TCP or UDP port will force a 'soft restart' - dropping your connections and clients as
if the router was stopped and restarted. <b>Please be patient</b> - it may take
a few seconds to complete.</i>
</form>
@@ -73,6 +75,13 @@
"i2p.reseedURL=someURL" (e.g. java -Di2p.reseedURL=http://dev.i2p.net/i2pdb/ ...). You can
also do it manually by getting routerInfo-*.dat files from someone (a friend, someone on IRC,
whatever) and saving them to your netDb/ directory.</p>
<p>
With the SSU transport, the internal UDP port may be different from the external
UDP port (in case of a firewall/NAT) - the UDP port field above specifies the
external one and assumes they are the same, but if you want to set the internal
port to something else, you can add "i2np.udp.internalPort=1234" to the
<a href="configadvanced.jsp">advanced</a> config and restart the router.
</p>
</div>
</body>

View File

@@ -8,5 +8,7 @@
%>Tunnels | <% } else { %><a href="configtunnels.jsp">Tunnels</a> | <% }
if (request.getRequestURI().indexOf("configlogging.jsp") != -1) {
%>Logging | <% } else { %><a href="configlogging.jsp">Logging</a> | <% }
if (request.getRequestURI().indexOf("configstats.jsp") != -1) {
%>Stats | <% } else { %><a href="configstats.jsp">Stats</a> | <% }
if (request.getRequestURI().indexOf("configadvanced.jsp") != -1) {
%>Advanced<% } else { %><a href="configadvanced.jsp">Advanced</a><% } %></h4>

View File

@@ -0,0 +1,104 @@
<%@page contentType="text/html"%>
<%@page pageEncoding="UTF-8"%>
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html><head>
<title>I2P Router Console - config stats</title>
<link rel="stylesheet" href="default.css" type="text/css" />
<script type="text/javascript">
function init()
{
checkAll = false;
}
function toggleAll(category)
{
var inputs = document.getElementsByTagName("input");
for(index = 0; index < inputs.length; index++)
{
if(inputs[index].id == category)
{
if(inputs[index].checked == 0)
{
inputs[index].checked = 1;
}
else if(inputs[index].checked == 1)
{
inputs[index].checked = 0;
}
}
if(category == '*')
{
if (checkAll == false)
{
inputs[index].checked = 1;
}
else if (checkAll == true)
{
inputs[index].checked = 0;
}
}
}
if(category == '*')
{
if (checkAll == false)
{
checkAll = true;
}
else if (checkAll == true)
{
checkAll = false;
}
}
}
</script>
</head><body onLoad="init();">
<%@include file="nav.jsp" %>
<%@include file="summary.jsp" %>
<div class="main" id="main">
<%@include file="confignav.jsp" %>
<jsp:useBean class="net.i2p.router.web.ConfigStatsHandler" id="formhandler" scope="request" />
<jsp:setProperty name="formhandler" property="contextId" value="<%=(String)session.getAttribute("i2p.contextId")%>" />
<jsp:setProperty name="formhandler" property="*" />
<font color="red"><jsp:getProperty name="formhandler" property="errors" /></font>
<i><jsp:getProperty name="formhandler" property="notices" /></i>
<jsp:useBean class="net.i2p.router.web.ConfigStatsHelper" id="statshelper" scope="request" />
<jsp:setProperty name="statshelper" property="contextId" value="<%=(String)session.getAttribute("i2p.contextId")%>" />
<form id="statsForm" name="statsForm" action="configstats.jsp" method="POST">
<% String prev = System.getProperty("net.i2p.router.web.ConfigStatsHandler.nonce");
if (prev != null) System.setProperty("net.i2p.router.web.ConfigStatsHandler.noncePrev", prev);
System.setProperty("net.i2p.router.web.ConfigStatsHandler.nonce", new java.util.Random().nextLong()+""); %>
<input type="hidden" name="action" value="foo" />
<input type="hidden" name="nonce" value="<%=System.getProperty("net.i2p.router.web.ConfigStatsHandler.nonce")%>" />
Stat file: <input type="text" name="filename" value="<%=statshelper.getFilename()%>" /><br />
Filter: (<a href="javascript: void(null);" onclick="toggleAll('*')">toggle all</a>)<br />
<table>
<% while (statshelper.hasMoreStats()) {
while (statshelper.groupRequired()) { %>
<tr><td valign="top" align="left" colspan="2">
<b><%=statshelper.getCurrentGroupName()%></b>
(<a href="javascript: void(null);" onclick="toggleAll('<%=statshelper.getCurrentGroupName()%>')">toggle all</a>)
</td></tr><%
} // end iterating over required groups for the current stat %>
<tr><td valign="top" align="left">
<input id="<%=statshelper.getCurrentGroupName()%>" type="checkbox" name="statList" value="<%=statshelper.getCurrentStatName()%>" <%
if (statshelper.getCurrentIsLogged()) { %>checked="true" <% } %>/></td>
<td valign="top" align="left"><b><%=statshelper.getCurrentStatName()%>:</b><br />
<%=statshelper.getCurrentStatDescription()%></td></tr><%
} // end iterating over all stats %>
<tr><td colspan="2"><hr /></td></tr>
<tr><td><input type="checkbox" name="explicitFilter" /></td>
<td>Advanced filter:
<input type="text" name="explicitFilterValue" value="<%=statshelper.getExplicitFilter()%>" size="40" /></td></tr>
<tr><td colspan="2"><hr /></td></tr>
<tr><td><input type="submit" name="shouldsave" value="Save changes" /> </td>
<td><input type="reset" value="Cancel" /></td></tr>
</form>
</table>
</div>
</body>
</html>
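
The JSP above stores a freshly generated nonce in a System property on every render and keeps the previous value alongside it; the hidden "nonce" field ties each POST to one of those two values, so the ConfigStatsHandler presumably rejects stale or replayed submissions. A minimal sketch of that check, under that assumption (the class and method below are illustrative, not the actual router console API):

```java
// Hypothetical sketch of the nonce check implied by configstats.jsp above.
// The property names mirror the ones the JSP sets; the class and method are illustrative.
public class NonceCheckSketch {
    /** @return true if the submitted nonce matches the current or the previous value */
    public static boolean validNonce(String submitted) {
        if (submitted == null) return false;
        String cur  = System.getProperty("net.i2p.router.web.ConfigStatsHandler.nonce");
        String prev = System.getProperty("net.i2p.router.web.ConfigStatsHandler.noncePrev");
        return submitted.equals(cur) || submitted.equals(prev);
    }
}
```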

View File

@@ -27,7 +27,7 @@
if (prev != null) System.setProperty("net.i2p.router.web.ConfigUpdateHandler.noncePrev", prev);
System.setProperty("net.i2p.router.web.ConfigUpdateHandler.nonce", new java.util.Random().nextLong()+""); %>
<input type="hidden" name="nonce" value="<%=System.getProperty("net.i2p.router.web.ConfigUpdateHandler.nonce")%>" />
<input type="hidden" name="action" value="update" />
<input type="submit" name="action" value="Check for update now" /><br /><br />
News URL:
<input type="text" size="60" name="newsURL" value="<jsp:getProperty name="updatehelper" property="newsURL" />"><br />
Refresh frequency:
@@ -36,10 +36,10 @@
<input type="text" size="60" name="updateURL" value="<jsp:getProperty name="updatehelper" property="updateURL" />"><br />
Update policy:
<jsp:getProperty name="updatehelper" property="updatePolicySelectBox" /><br />
Update anonymously?
Update through the eepProxy?
<jsp:getProperty name="updatehelper" property="updateThroughProxy" /><br />
Proxy host: <input type="text" size="10" name="proxyHost" value="<jsp:getProperty name="updatehelper" property="proxyHost" />" /><br />
Proxy port: <input type="text" size="4" name="proxyPort" value="<jsp:getProperty name="updatehelper" property="proxyPort" />" /><br />
eepProxy host: <input type="text" size="10" name="proxyHost" value="<jsp:getProperty name="updatehelper" property="proxyHost" />" /><br />
eepProxy port: <input type="text" size="4" name="proxyPort" value="<jsp:getProperty name="updatehelper" property="proxyPort" />" /><br />
<!-- prompt for the eepproxy -->
Trusted keys:
<textarea name="trustedKeys" disabled="true" cols="60" rows="2"><jsp:getProperty name="updatehelper" property="trustedKeys" /></textarea>

View File

@@ -0,0 +1,21 @@
<%@page contentType="text/html"%>
<%@page pageEncoding="UTF-8"%>
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN">
<html><head>
<title>I2P Router Console - peer connections</title>
<link rel="stylesheet" href="default.css" type="text/css" />
</head><body>
<%@include file="nav.jsp" %>
<%@include file="summary.jsp" %>
<div class="main" id="main">
<jsp:useBean class="net.i2p.router.web.PeerHelper" id="peerHelper" scope="request" />
<jsp:setProperty name="peerHelper" property="contextId" value="<%=(String)session.getAttribute("i2p.contextId")%>" />
<jsp:setProperty name="peerHelper" property="out" value="<%=out%>" />
<jsp:getProperty name="peerHelper" property="peerSummary" />
</div>
</body>
</html>

View File

@@ -14,7 +14,8 @@
<b>Version:</b> <jsp:getProperty name="helper" property="version" /><br />
<b>Uptime:</b> <jsp:getProperty name="helper" property="uptime" /><br />
<b>Now:</b> <jsp:getProperty name="helper" property="time" /><br />
<b>Memory:</b> <jsp:getProperty name="helper" property="memory" /><br /><%
<b>Memory:</b> <jsp:getProperty name="helper" property="memory" /><br />
<b>Status:</b> <a href="config.jsp"><jsp:getProperty name="helper" property="reachability" /></a><br /><%
if (helper.updateAvailable()) {
if ("true".equals(System.getProperty("net.i2p.router.web.UpdateHandler.updateInProgress", "false"))) {
out.print(update.getStatus());
@@ -33,13 +34,14 @@
}
%><hr />
<u><b>Peers</b></u><br />
<u><b><a href="peers.jsp">Peers</a></b></u><br />
<b>Active:</b> <jsp:getProperty name="helper" property="activePeers" />/<jsp:getProperty name="helper" property="activeProfiles" /><br />
<b>Fast:</b> <jsp:getProperty name="helper" property="fastPeers" /><br />
<b>High capacity:</b> <jsp:getProperty name="helper" property="highCapacityPeers" /><br />
<b>Well integrated:</b> <jsp:getProperty name="helper" property="wellIntegratedPeers" /><br />
<b>Failing:</b> <jsp:getProperty name="helper" property="failingPeers" /><br />
<b>Shitlisted:</b> <jsp:getProperty name="helper" property="shitlistedPeers" /><br /><%
<b>Shitlisted:</b> <jsp:getProperty name="helper" property="shitlistedPeers" /><br />
<b>Known:</b> <jsp:getProperty name="helper" property="allPeers" /><br /><%
if (helper.getActivePeers() <= 0) {
%><b><a href="config.jsp">check your NAT/firewall</a></b><br /><%
}

View File

@@ -1,4 +1,4 @@
<?xml version="1.0" encoding="UTF-8"?>
<?xml version="1.0" encoding="ISO-8859-1"?>
<!DOCTYPE web-app
PUBLIC "-//Sun Microsystems, Inc.//DTD Web Application 2.2//EN"
"http://java.sun.com/j2ee/dtds/web-app_2.2.dtd">
@@ -14,4 +14,4 @@
<welcome-file>index.html</welcome-file>
<welcome-file>index.jsp</welcome-file>
</welcome-file-list>
</web-app>
</web-app>

apps/sam/c/Makefile (new file, 64 lines)
View File

@@ -0,0 +1,64 @@
FLAGS+=-g
CFLAGS+=$(FLAGS)
LDFLAGS+=$(FLAGS)
OBJS:=obj/sam.lo obj/strl.lo obj/parse.lo obj/tinystring.lo
DEPS:=$(patsubst obj/%.lo, .deps/%.d, $(OBJS))
DESTDIR:=$(if $(DESTDIR),$(DESTDIR)/lib,/usr/lib)
MAKEFLAGS=-s -r
PERL=$(shell which perl 2>/dev/null)
ifneq ($(PERL),)
STATUS=$(PERL) ./status
else
STATUS=echo
endif
LIBTOOL_LOG=libtool.log
all:: cleanlog .deps/finish
cleanlog:
echo >$(LIBTOOL_LOG)
lib/libsam.so: obj/libsam.la
libtool --mode=install install $^ `pwd`/$@ >>$(LIBTOOL_LOG)
obj/libsam-static.la: $(OBJS)
$(STATUS) library '(static)'
libtool --mode=link gcc -static $(LDFLAGS) -o $@ $^ >>$(LIBTOOL_LOG)
obj/libsam.la: $(OBJS)
$(STATUS) library '(shared)'
libtool --mode=link gcc -rpath $(DESTDIR) $(LDFLAGS) -o $@ $^ >>$(LIBTOOL_LOG)
obj/%.lo: src/%.c
$(STATUS) compile $*
libtool --mode=compile gcc $(CFLAGS) -Iinc/ -c -o $@ $< >>$(LIBTOOL_LOG)
$(OBJS):|obj
obj:
$(STATUS) MKDIR $@
mkdir -p $@
.deps/%.d: src/%.c
$(STATUS) deps $*
gcc -Iinc/ -MM -MT obj/$*.o $< -o $@
-include $(DEPS)
DEPS+=.deps/finish
.deps/finish: lib/libsam.so
libtool --finish $(DESTDIR) >>$(LIBTOOL_LOG) && touch $@
$(DEPS):|.deps
.deps:
$(STATUS) MKDIR $@
mkdir -p $@
clean:
$(STATUS) clean
libtool --mode=clean rm -f obj/*.l* lib/*.l* lib/*.so* lib/*.a >>$(LIBTOOL_LOG)
rm -Rf .deps libtool.log
.PHONY: all cleanlog clean

View File

@@ -1,25 +0,0 @@
#
# This Makefile contains instructions common to all platforms
#
#
# Build rules
#
all: clean depend libsam
depend:
$(CC) $(CFLAGS) -MM $(SRCDIR)/*.c > .depend
$(OBJDIR)/%.o: $(SRCDIR)/%.c
$(CC) $(CFLAGS) -o $@ -c $<
libsam: $(OBJS)
$(AR) rcs $(LIBDIR)/libsam.a $(OBJS)
#
# Cleanup rules
#
clean:
-$(RM) -f $(LIBDIR)/libsam.a $(OBJDIR)/*.o .depend

View File

@@ -1,48 +0,0 @@
#
# This Makefile is compatible with GNU Make and should work on Cygwin
#
#
# Your operating system
#
OS = CYGWIN
#
# Directories
#
INCDIR = inc
LIBDIR = lib
OBJDIR = obj
SRCDIR = src
#
# Programs
#
AR = ar
CC = gcc
RM = rm
#
# Flags
#
CFLAGS = -g -O2 -pipe -std=c99 -Wall
CFLAGS += -DOS=$(OS)
CFLAGS += -I$(INCDIR)
#
# Object files
#
OBJS = $(OBJDIR)/sam.o \
$(OBJDIR)/snprintf.o \
$(OBJDIR)/strl.o
#
# Include the make instructions common to all platforms
#
include Makefile.common

View File

@@ -1,46 +0,0 @@
#
# This Makefile is compatible with GNU Make and should work on FreeBSD
#
#
# Your operating system
#
OS = FREEBSD
#
# Directories
#
INCDIR = inc
LIBDIR = lib
OBJDIR = obj
SRCDIR = src
#
# Programs
#
AR = ar
CC = gcc
RM = rm
#
# Flags
#
CFLAGS = -g -O2 -pipe -std=c99 -Wall
CFLAGS += -DOS=$(OS)
CFLAGS += -I$(INCDIR)
#
# Object files
#
OBJS = $(OBJDIR)/sam.o
#
# Include the make instructions common to all platforms
#
include Makefile.common

View File

@@ -1,47 +0,0 @@
#
# This Makefile is compatible with GNU Make and should work on Linux
#
#
# Your operating system
#
OS = LINUX
#
# Directories
#
INCDIR = inc
LIBDIR = lib
OBJDIR = obj
SRCDIR = src
#
# Programs
#
AR = ar
CC = gcc
RM = rm
#
# Flags
#
CFLAGS = -g -O2 -pipe -std=c99 -Wall
CFLAGS += -DOS=$(OS)
CFLAGS += -I$(INCDIR)
#
# Object files
#
OBJS = $(OBJDIR)/sam.o \
$(OBJDIR)/strl.o
#
# Include the make instructions common to all platforms
#
include Makefile.common

View File

@@ -1,47 +0,0 @@
#
# This Makefile is compatible with GNU Make and should work on Windows (MingW)
#
#
# Your operating system
#
OS = MINGW
#
# Directories
#
INCDIR = inc
LIBDIR = lib
OBJDIR = obj
SRCDIR = src
#
# Programs
#
AR = C:\MinGW\bin\ar
CC = C:\MinGW\bin\gcc
RM = C:\MinGW\bin\rm
#
# Flags
#
CFLAGS = -g -O2 -pipe -std=c99 -Wall
CFLAGS += -DOS=$(OS)
CFLAGS += -I$(INCDIR)
#
# Object files
#
OBJS = $(OBJDIR)/sam.o \
$(OBJDIR)/strl.o
#
# Include the make instructions common to all platforms
#
include Makefile.common

View File

@@ -1,39 +1,54 @@
#
# This Makefile is compatible with GNU Make and should work on POSIX systems
#
FLAGS+=-g
#
# Programs
#
CFLAGS = $(FLAGS) -pipe -std=c99 -Wall
CFLAGS += -I../../inc
LDFLAGS = $(FLAGS) -L../../lib -lsam
CC = gcc
INSTALL = install
RM = rm
OBJS:=i2p-ping.lo
DEPS:=$(patsubst obj/%.lo, .deps/%.d, $(OBJS))
DESTDIR:=$(if $(DESTDIR),$(DESTDIR)/lib,/usr/lib)
#
# Flags
#
MAKEFLAGS=-s -r
PERL=$(shell which perl 2>/dev/null)
ifneq ($(PERL),)
STATUS=$(PERL) ../../status
else
STATUS=echo
endif
CFLAGS = -g -O2 -pipe -std=c99 -Wall
CFLAGS += -I../../inc -L../../lib
LIBS = -lsam
LIBTOOL_LOG=libtool.log
#
# Build rules
#
all:: cleanlog i2p-ping
all: clean i2p-ping
cleanlog:
>$(LIBTOOL_LOG)
i2p-ping: i2p-ping.c
$(CC) $(CFLAGS) -o i2p-ping.o -c i2p-ping.c
$(CC) $(CFLAGS) -o i2p-ping i2p-ping.o $(LIBS)
i2p-ping: $(OBJS)
$(STATUS) link
libtool --mode=link gcc $(LDFLAGS) -o $@ $^ >>$(LIBTOOL_LOG)
install: i2p-ping
$(INSTALL) i2p-ping $(HOME)/bin
#
# Cleanup rules
#
%.lo: %.c
$(STATUS) compile $*
libtool --mode=compile gcc $(CFLAGS) -Iinc/ -c -o $@ $< >>$(LIBTOOL_LOG)
.deps/%.d: src/%.c
$(STATUS) deps $*
gcc -Iinc/ -MM -MT obj/$*.o $^ -o $@
clean:
-$(RM) -f i2p-ping *.o
$(STATUS) clean
rm -Rf .deps obj libtool.log
libtool --mode=clean rm -f i2p-ping i2p-ping.lo >>$(LIBTOOL_LOG)
$(OBJS):|obj
obj:
$(STATUS) MKDIR $@
mkdir -p $@
-include $(DEPS)
$(DEPS):|.deps
.deps:
$(STATUS) MKDIR $@
mkdir -p $@
.PHONY: all cleanlog clean

apps/sam/c/inc/parse.h (new file, 24 lines)
View File

@@ -0,0 +1,24 @@
#ifndef _PARSE_HEADER_FEEP
#define _PARSE_HEADER_FEEP
#include "tinystring.h"
typedef struct arg_s {
string_t name;
string_t value;
// int pos;
} arg_t;
typedef struct {
arg_t* arg;
int num;
} args_t;
args_t arg_parse(const char*);
void arg_done(args_t);
arg_t* arg_get(args_t,int);
arg_t* arg_find(args_t,string_t);
#define AG(a,b) arg_get(a,b)
#endif /* _PARSE_HEADER_FEEP */

View File

@@ -121,9 +121,9 @@ bool sam_read_buffer(sam_sess_t *session);
const char *sam_strerror(samerr_t code);
/* SAM controls - callbacks */
void (*sam_diedback)(sam_sess_t *session);
void (*sam_logback)(char *str);
void (*sam_namingback)(sam_sess_t *session, char *name,
sam_pubkey_t pubkey, samerr_t result);
void (*sam_logback)(const char *str);
void (*sam_namingback)(sam_sess_t *session, const char *name,
sam_pubkey_t pubkey, samerr_t result, const char* message);
/* Stream commands */
void sam_stream_close(sam_sess_t *session, sam_sid_t stream_id);
@@ -131,14 +131,15 @@ sam_sid_t sam_stream_connect(sam_sess_t *session, const sam_pubkey_t dest);
samerr_t sam_stream_send(sam_sess_t *session, sam_sid_t stream_id,
const void *data, size_t size);
/* Stream commands - callbacks */
void (*sam_closeback)(sam_sess_t *session, sam_sid_t stream_id,
samerr_t reason);
void (*sam_closeback)(sam_sess_t *session, sam_sid_t stream_id,
samerr_t reason, const char* message);
void (*sam_connectback)(sam_sess_t *session, sam_sid_t stream_id,
sam_pubkey_t dest);
void (*sam_databack)(sam_sess_t *session, sam_sid_t stream_id,
sam_pubkey_t dest);
void (*sam_databack)(sam_sess_t *session, sam_sid_t stream_id,
void *data, size_t size);
void (*sam_statusback)(sam_sess_t *session, sam_sid_t stream_id,
samerr_t result);
void (*sam_statusback)(sam_sess_t *session, sam_sid_t stream_id,
samerr_t result, const char* message);
/* Stream send queue (experimental) */
void sam_sendq_add(sam_sess_t *session, sam_sid_t stream_id,

View File

@@ -0,0 +1,48 @@
#ifndef TINYSTRING_HEADER
#define TINYSTRING_HEADER
#include <sys/types.h>
#ifndef bool
#define bool short int
#endif
struct string_s;
#define string_t struct string_s*
//Mysteeeerious *waggles mysteriously*
/*{
char* data;
long int size;
} *string_t;
*/
string_t string_create(const char*);
string_t string_ncreate(const char* cstr,long int length);
string_t string_wrap(const char*);
//Does not malloc, do NOT pass to string_free
string_t string_fmt(const char* fmt, ...);
string_t string_cat(string_t,string_t);
/* Source Dest */
void string_copy(string_t,string_t);
void string_copy_raw(string_t,void*,size_t);
const char* string_data(string_t);
long int string_size(string_t);
void string_free(string_t);
bool string_equal(string_t,string_t);
bool string_equali(string_t,string_t);
int string_cmp(string_t,string_t);
int string_cmpi(string_t,string_t);
#define _sw(a) string_wrap(a)
#define _scr(a,b,c) string_copy_raw(a,b,c)
#define string_is(a,b) (! strncmp(string_data(a),(b),string_size(a)))
#endif /* TINYSTRING_HEADER */

apps/sam/c/src/parse.c (new file, 78 lines)
View File

@@ -0,0 +1,78 @@
#include "parse.h"
#include <assert.h>
#include <ctype.h>
#include <malloc.h>
#define _GNU_SOURCE
#include <string.h>
args_t arg_parse(const char* line_raw) {
args_t self;
int numargs = 0;
const char *end, *last;
/* First pass to count how many args... */
end = line_raw;
while(*end && isspace(*end)) ++end;
//Skip initial space...
for(;;) {
while(*end && !isspace(*end)) ++end;
//Go to end of argument
++numargs;
while(*end && isspace(*end)) ++end;
//Go to end of space after argument
if(!*end) break;
}
self.num = numargs; // One more # args than spaces.
self.arg = malloc(sizeof(arg_t)*numargs);
/* Second pass to assign args. (Lemee alone, is more efficient than a linked list!) */
last = line_raw;
numargs = 0; //Now numargs is which current arg.
end = line_raw;
while(*end && isspace(*end)) ++end;
//Skip initial space...
for(;;) {
arg_t* nextarg = self.arg + numargs;;
const char* isbinary;
while(*end && !isspace(*end)) ++end;
//Go to end of argument
isbinary = strchr(last,'='); //Is there a value?
//Make sure not to pass end in our search for =
if(isbinary && (isbinary < end)) {
nextarg->name = string_ncreate(last,isbinary-last);
nextarg->value = string_ncreate(isbinary+1,end-isbinary-1);
} else {
nextarg->name = string_ncreate(last,end-last);
nextarg->value = string_create(NULL);
}
++numargs;
while(*end && isspace(*end)) ++end;
//Go to end of space after argument
if(!*end) break;
last = end;
}
return self;
}
void arg_done(args_t self) {
free(self.arg);
self.arg = NULL;
self.num = 0;
}
arg_t* arg_get(args_t self, int index) {
if(index >= self.num) return NULL;
return self.arg + index;
}
arg_t* arg_find(args_t self,string_t testname) {
int index;
for(index=0;index<self.num;++index) {
if(string_equali(self.arg[index].name,testname)) {
return self.arg + index;
}
}
return NULL;
}
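
arg_parse() above makes two passes over a SAM reply line: one to count whitespace-separated tokens, one to split each token at its first '=' into a name/value pair (tokens without '=' keep an empty value). The same tokenization, sketched in Java purely as an illustration of the format being handled (the sample line below is made up):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Illustrative only: the whitespace + first-'=' split performed by arg_parse()
// above, shown in Java so the SAM replies handled later in sam.c are easier to follow.
public class SamLineSplitSketch {
    public static Map<String, String> split(String line) {
        Map<String, String> args = new LinkedHashMap<String, String>();
        for (String tok : line.trim().split("\\s+")) {
            int eq = tok.indexOf('=');
            if (eq >= 0)
                args.put(tok.substring(0, eq), tok.substring(eq + 1));
            else
                args.put(tok, "");       // bare token: name only, empty value
        }
        return args;
    }

    public static void main(String[] unused) {
        // prints {NAMING=, REPLY=, RESULT=OK, NAME=foo.i2p, VALUE=someBase64Key}
        System.out.println(split("NAMING REPLY RESULT=OK NAME=foo.i2p VALUE=someBase64Key"));
    }
}
```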

View File

@@ -30,6 +30,10 @@
#include "sam.h"
#include "platform.h"
#include "parse.h"
#include "tinystring.h"
#include <assert.h>
static bool sam_hello(sam_sess_t *session);
static void sam_log(const char *format, ...);
@@ -57,7 +61,7 @@ static ssize_t sam_write(sam_sess_t *session, const void *buf, size_t n);
*/
/* a peer closed the connection */
void (*sam_closeback)(sam_sess_t *session, sam_sid_t stream_id, samerr_t reason)
void (*sam_closeback)(sam_sess_t *session, sam_sid_t stream_id, samerr_t reason, const char* message)
= NULL;
/* a peer connected to us */
@@ -76,15 +80,14 @@ void (*sam_dgramback)(sam_sess_t *session, sam_pubkey_t dest, void *data,
void (*sam_diedback)(sam_sess_t *session) = NULL;
/* logging callback */
void (*sam_logback)(char *str) = NULL;
void (*sam_logback)(const char *str) = NULL;
/* naming lookup reply - `pubkey' will be NULL if `result' isn't SAM_OK */
void (*sam_namingback)(sam_sess_t *session, char *name, sam_pubkey_t pubkey,
samerr_t result) = NULL;
void (*sam_namingback)(sam_sess_t *session, const char *name, sam_pubkey_t pubkey, samerr_t result, const char* message) = NULL;
/* our connection to a peer has completed */
void (*sam_statusback)(sam_sess_t *session, sam_sid_t stream_id,
samerr_t result) = NULL;
samerr_t result, const char* message) = NULL;
/* a peer sent some raw data (`data' MUST be freed) */
void (*sam_rawback)(sam_sess_t *session, void *data, size_t size) = NULL;
@@ -290,13 +293,13 @@ static void sam_log(const char *format, ...)
*/
void sam_naming_lookup(sam_sess_t *session, const char *name)
{
assert(session != NULL);
char cmd[SAM_CMD_LEN];
assert(session != NULL);
char cmd[SAM_CMD_LEN];
snprintf(cmd, sizeof cmd, "NAMING LOOKUP NAME=%s\n", name);
sam_write(session, cmd, strlen(cmd));
snprintf(cmd, sizeof cmd, "NAMING LOOKUP NAME=%s\n", name);
sam_write(session, cmd, strlen(cmd));
return;
return;
}
/*
@@ -304,242 +307,193 @@ void sam_naming_lookup(sam_sess_t *session, const char *name)
*
* s - string of data that we read (read past tense)
*/
bool sam_parse_args(sam_sess_t *session, args_t args);
static void sam_parse(sam_sess_t *session, char *s)
{
assert(session != NULL);
#define SAM_DGRAM_RECEIVED_REPLY "DATAGRAM RECEIVED"
#define SAM_NAMING_REPLY "NAMING REPLY"
#define SAM_NAMING_REPLY_OK "NAMING REPLY RESULT=OK"
#define SAM_NAMING_REPLY_IK "NAMING REPLY RESULT=INVALID_KEY"
#define SAM_NAMING_REPLY_KNF "NAMING REPLY RESULT=KEY_NOT_FOUND"
#define SAM_RAW_RECEIVED_REPLY "RAW RECEIVED"
#define SAM_STREAM_CLOSED_REPLY "STREAM CLOSED"
#define SAM_STREAM_CONNECTED_REPLY "STREAM CONNECTED"
#define SAM_STREAM_RECEIVED_REPLY "STREAM RECEIVED"
#define SAM_STREAM_STATUS_REPLY "STREAM STATUS"
#define SAM_STREAM_STATUS_REPLY_OK "STREAM STATUS RESULT=OK"
#define SAM_STREAM_STATUS_REPLY_CRP "STREAM STATUS RESULT=CANT_REACH_PEER"
#define SAM_STREAM_STATUS_REPLY_I2E "STREAM STATUS RESULT=I2P_ERROR"
#define SAM_STREAM_STATUS_REPLY_IK "STREAM STATUS RESULT=INVALID_KEY"
#define SAM_STREAM_STATUS_REPLY_TO "STREAM STATUS RESULT=TIMEOUT"
//Wrapper for ease of memory management
args_t args;
assert(session != NULL);
args = arg_parse(s);
if(!sam_parse_args(session, args)) {
SAMLOG("Unknown SAM command received: %s", s);
}
arg_done(args);
}
/*
* TODO: add raw parsing
*/
long int strtol_checked(const char* str) {
static char* end = NULL;
long int ret = strtol(str,&end,10);
assert(str != end || "No number found at all!");
return ret;
}
if (strncmp(s, SAM_DGRAM_RECEIVED_REPLY,
strlen(SAM_DGRAM_RECEIVED_REPLY)) == 0) {
char *p;
sam_pubkey_t dest;
size_t size;
void *data;
p = strchr(s, '='); /* DESTINATION= */
assert(p != NULL);
p++;
strlcpy(dest, p, sizeof dest);
p = strchr(p, '='); /* SIZE= */
assert(p != NULL);
p++;
size = strtol(p, NULL, 10);
assert(size != 0);
data = malloc(size + 1); /* +1 for NUL termination, so when we are
receiving a string it will just work and it
won't be necessary to send NUL. When binary
data is sent, the extra NUL character will
just be ignored by the client program,
because it is not added to the size */
if (data == NULL) {
SAMLOGS("Out of memory");
abort();
}
if (sam_read2(session, data, size) != -1) {
p = data + size;
*p = '\0'; /* see above NUL note */
sam_dgramback(session, dest, data, size); /* `data' must be freed */
} else
free(data);
return;
} else if (strncmp(s, SAM_NAMING_REPLY, strlen(SAM_NAMING_REPLY)) == 0) {
char *p;
char *q;
char name[SAM_NAME_LEN];
p = strchr(s, '='); /* can't use strrchar because of option
MESSAGE= */
assert(p != NULL); /* RESULT= */
p++;
p = strchr(p, '='); /* NAME= */
assert(p != NULL);
p++;
if (strncmp(s, SAM_NAMING_REPLY_OK, strlen(SAM_NAMING_REPLY_OK)) == 0) {
sam_pubkey_t pubkey;
q = strchr(p, ' '); /* ' 'VAL.. */
assert(q != NULL);
*q = '\0';
q++;
q = strchr(q, '='); /* VALUE= */
assert(q != NULL);
q++;
strlcpy(name, p, sizeof name);
strlcpy(pubkey, q, sizeof pubkey);
sam_namingback(session, name, pubkey, SAM_OK);
} else if (strncmp(s, SAM_NAMING_REPLY_IK,
strlen(SAM_NAMING_REPLY_IK)) == 0) {
q = strchr(p, ' '); /* ' 'MES.. (optional) */
if (q != NULL)
*q = '\0';
strlcpy(name, p, sizeof name);
sam_namingback(session, name, NULL, SAM_INVALID_KEY);
} else if (strncmp(s, SAM_NAMING_REPLY_KNF,
strlen(SAM_NAMING_REPLY_KNF)) == 0) {
q = strchr(p, ' '); /* ' 'MES.. (optional) */
if (q != NULL)
*q = '\0';
strlcpy(name, p, sizeof name);
sam_namingback(session, name, NULL, SAM_KEY_NOT_FOUND);
} else {
q = strchr(p, ' '); /* ' 'MES.. (optional) */
if (q != NULL)
*q = '\0';
strlcpy(name, p, sizeof name);
sam_namingback(session, name, NULL, SAM_UNKNOWN);
}
return;
} else if (strncmp(s, SAM_STREAM_CLOSED_REPLY,
strlen(SAM_STREAM_CLOSED_REPLY)) == 0) {
char *p;
sam_sid_t stream_id;
p = strchr(s, '='); /* can't use strrchar because of option MESSAGE= */
assert(p != NULL); /* ID= */
p++;
stream_id = strtol(p, NULL, 10);
assert(stream_id != 0);
p = strchr(p, '='); /* RESULT= */
assert(p != NULL);
p++;
if (strncmp(p, "OK", strlen("OK")) == 0)
sam_closeback(session, stream_id, SAM_OK);
else if (strncmp(p, "CANT_REACH_PEER", strlen("CANT_REACH_PEER")) == 0)
sam_closeback(session, stream_id, SAM_CANT_REACH_PEER);
else if (strncmp(p, "I2P_ERROR", strlen("I2P_ERROR")) == 0)
sam_closeback(session, stream_id, SAM_I2P_ERROR);
else if (strncmp(p, "PEER_NOT_FOUND", strlen("PEER_NOT_FOUND")) == 0)
sam_closeback(session, stream_id, SAM_PEER_NOT_FOUND);
else if (strncmp(p, "TIMEOUT", strlen("TIMEOUT")) == 0)
sam_closeback(session, stream_id, SAM_TIMEOUT);
else
sam_closeback(session, stream_id, SAM_UNKNOWN);
return;
} else if (strncmp(s, SAM_STREAM_CONNECTED_REPLY,
strlen(SAM_STREAM_CONNECTED_REPLY)) == 0) {
char *p;
sam_sid_t stream_id;
sam_pubkey_t dest;
p = strrchr(s, '='); /* ID= */
assert(p != NULL);
*p = '\0';
p++;
stream_id = strtol(p, NULL, 10);
assert(stream_id != 0);
p = strstr(s, "N="); /* DESTINATION= */
p += 2;
strlcpy(dest, p, sizeof dest);
sam_connectback(session, stream_id, dest);
bool sam_parse_args(sam_sess_t *session, args_t args)
{
arg_t* arg; // The current argument being examined...
const char* message = NULL; // Almost EVERYTHING can have a message...
return;
if(args.num <= 0) return 0;
} else if (strncmp(s, SAM_STREAM_RECEIVED_REPLY,
strlen(SAM_STREAM_RECEIVED_REPLY)) == 0) {
char *p;
sam_sid_t stream_id;
#define ARG_IS(a,b) string_equal(AG(args,a)->name,string_wrap(b))
#define ARG_FIND(a) arg_find(args,_sw(a))
// Almost EVERYTHING can have a message...
arg = ARG_FIND("MESSAGE");
if(arg) {
message = string_data(arg->value);
}
if(ARG_IS(0,"DATAGRAM") &&
ARG_IS(1,"RECEIVED")) {
sam_pubkey_t dest;
size_t size;
void *data;
arg = ARG_FIND("DESTINATION");
assert(arg != NULL);
_scr(arg->value, dest, sizeof dest);
arg = ARG_FIND("SIZE");
assert(arg != NULL);
size = strtol_checked(string_data(arg->value));
data = malloc(size + 1);
/* +1 for NUL termination, so when we are
receiving a string it will just work and it
won't be necessary to send NUL. When binary
data is sent, the extra NUL character will
just be ignored by the client program,
because it is not added to the size */
if (data == NULL) {
SAMLOGS("Out of memory");
abort();
}
if (sam_read2(session, data, size) != -1) {
char* p = data + size;
*p = '\0'; /* see above NUL note */
sam_dgramback(session, dest, data, size); /* `data' must be freed */
} else
free(data);
} else if (ARG_IS(0,"NAMING") &&
ARG_IS(1, "REPLY")) {
if(NULL == (arg = ARG_FIND("RESULT"))) {
SAMLOGS("Naming reply with no result");
return 0;
}
if (string_is(arg->value,"OK")) {
sam_pubkey_t pubkey;
arg = ARG_FIND("VALUE");
assert(arg != NULL);
_scr(arg->value, pubkey, sizeof pubkey);
arg = ARG_FIND("NAME");
assert(arg != NULL);
sam_namingback(session, string_data(arg->value), pubkey, SAM_OK, message);
} else if(string_is(arg->value,"INVALID_KEY")) {
arg_t* namearg = ARG_FIND("NAME");
assert(namearg != NULL);
sam_namingback(session, string_data(namearg->value), NULL,
SAM_INVALID_KEY, message);
} else if(string_is(arg->value,"KEY_NOT_FOUND")) {
arg_t* namearg = ARG_FIND("NAME");
assert(namearg != NULL);
sam_namingback(session, string_data(namearg->value), NULL,
SAM_KEY_NOT_FOUND, message);
} else {
arg_t* namearg = ARG_FIND("NAME");
assert(namearg != NULL);
sam_namingback(session, string_data(namearg->value), NULL,
SAM_UNKNOWN, message);
}
} else if (ARG_IS(0,"STREAM")) {
sam_sid_t stream_id;
arg = ARG_FIND("ID");
assert(arg != 0);
stream_id = strtol_checked(string_data(arg->value));
if(ARG_IS(1,"CLOSED")) {
arg = ARG_FIND("RESULT");
assert(arg != NULL);
if (string_is(arg->value,"OK")) {
sam_closeback(session, stream_id, SAM_OK, message);
} else if (string_is(arg->value,"CANT_REACH_PEER")) {
sam_closeback(session, stream_id, SAM_CANT_REACH_PEER, message);
} else if (string_is(arg->value,"I2P_ERROR")) {
sam_closeback(session, stream_id, SAM_I2P_ERROR, message);
} else if (string_is(arg->value,"PEER_NOT_FOUND")) {
sam_closeback(session, stream_id, SAM_PEER_NOT_FOUND, message);
} else if (string_is(arg->value,"TIMEOUT")) {
sam_closeback(session, stream_id, SAM_TIMEOUT, message);
} else {
sam_closeback(session, stream_id, SAM_UNKNOWN, message);
}
} else if(ARG_IS(1,"CONNECTED")) {
sam_pubkey_t dest;
arg = ARG_FIND("DESTINATION");
assert(arg != NULL);
_scr(arg->value, dest, sizeof dest);
sam_connectback(session, stream_id, dest);
} else if(ARG_IS(1,"RECEIVED")) {
size_t size;
void *data;
p = strrchr(s, '='); /* SIZE= */
assert(p != NULL);
p++;
size = strtol(p, NULL, 10);
assert(size != 0);
p -= 6;
*p = '\0';
p = strrchr(s, '='); /* ID= */
assert(p != NULL);
p++;
stream_id = strtol(p, NULL, 10);
assert(stream_id != 0);
data = malloc(size + 1); /* +1 for NUL termination, so when we are
receiving a string it will just work and it
won't be necessary to send NUL. When binary
data is sent, the extra NUL character will
just be ignored by the client program,
because it is not added to the size */
arg = ARG_FIND("SIZE");
assert(arg != NULL);
size = strtol_checked(string_data(arg->value));
data = malloc(size + 1);
/* +1 for NUL termination, so when we are
receiving a string it will just work and it
won't be necessary to send NUL. When binary
data is sent, the extra NUL character will
just be ignored by the client program,
because it is not added to the size */
if (data == NULL) {
SAMLOGS("Out of memory");
abort();
}
if (sam_read2(session, data, size) != -1) {
p = data + size;
char* p = data + size;
*p = '\0'; /* see above NUL note */
sam_databack(session, stream_id, data, size);
/* ^^^ `data' must be freed ^^^*/
} else
free(data);
return;
} else if (strncmp(s, SAM_STREAM_STATUS_REPLY,
strlen(SAM_STREAM_STATUS_REPLY)) == 0) {
char *p;
sam_sid_t stream_id;
p = strchr(s, '='); /* can't use strrchar because of option MESSAGE= */
assert(p != NULL); /* RESULT= */
p++;
p = strchr(p, '='); /* ID= */
assert(p != NULL);
p++;
stream_id = strtol(p, NULL, 10);
assert(stream_id != 0);
if (strncmp(s, SAM_STREAM_STATUS_REPLY_OK,
strlen(SAM_STREAM_STATUS_REPLY_OK)) == 0)
sam_statusback(session, stream_id, SAM_OK);
else if (strncmp(s, SAM_STREAM_STATUS_REPLY_CRP,
strlen(SAM_STREAM_STATUS_REPLY_CRP)) == 0)
sam_statusback(session, stream_id, SAM_CANT_REACH_PEER);
else if (strncmp(s, SAM_STREAM_STATUS_REPLY_I2E,
strlen(SAM_STREAM_STATUS_REPLY_I2E)) == 0)
sam_statusback(session, stream_id, SAM_I2P_ERROR);
else if (strncmp(s, SAM_STREAM_STATUS_REPLY_IK,
strlen(SAM_STREAM_STATUS_REPLY_IK)) == 0)
sam_statusback(session, stream_id, SAM_INVALID_KEY);
else if (strncmp(s, SAM_STREAM_STATUS_REPLY_TO,
strlen(SAM_STREAM_STATUS_REPLY_TO)) == 0)
sam_statusback(session, stream_id, SAM_TIMEOUT);
else
sam_statusback(session, stream_id, SAM_UNKNOWN);
return;
} else
SAMLOG("Unknown SAM command received: %s", s);
return;
} else if(ARG_IS(1,"STATUS")) {
arg = ARG_FIND("RESULT");
assert(arg != NULL);
if (string_is(arg->value,"OK")) {
sam_statusback(session, stream_id, SAM_OK, message);
} else if (string_is(arg->value,"CANT_REACH_PEER")) {
sam_statusback(session, stream_id,
SAM_CANT_REACH_PEER, message);
} else if (string_is(arg->value,"I2P_ERROR")) {
sam_statusback(session, stream_id, SAM_I2P_ERROR, message);
} else if (string_is(arg->value,"INVALID_KEY")) {
sam_statusback(session, stream_id, SAM_INVALID_KEY, message);
} else if (string_is(arg->value,"TIMEOUT")) {
sam_statusback(session, stream_id, SAM_TIMEOUT, message);
} else {
sam_statusback(session, stream_id, SAM_UNKNOWN, message);
}
}
} else
return 0;
return -1;
}
#undef ARG_IS
#undef ARG_FIND
/*
* Sends data to a destination in a raw packet
*

apps/sam/c/src/tinystring.c (new file, 128 lines)
View File

@@ -0,0 +1,128 @@
#include "tinystring.h"
#include <assert.h>
#include <stdarg.h>
#include <stdio.h>
#include <malloc.h>
#define _GNU_SOURCE
#include <string.h>
#ifndef min
#define min(a,b) ((a) > (b) ? (b) : (a))
#endif
extern char *strndup(const char *s, size_t n);
struct string_s {
const char* data;
long int size;
bool _no_del; //SIGH...
};
string_t string_ncreate(const char* cstr,long int length) {
string_t self = malloc(sizeof(struct string_s));
self->size = length;
if(cstr) self->data = strndup(cstr,length);
else self->data = NULL;
self->_no_del = 0;
return self;
}
string_t string_create(const char* cstr) {
if(!cstr)
return string_ncreate(NULL, 0);
return string_ncreate(cstr, strlen(cstr));
}
string_t string_nwrap(const char* cstr, long int length) {
static struct string_s self;
self.size = length;
self.data = cstr;
self._no_del = 1;
return &self;
}
string_t string_wrap(const char* cstr) {
if(!cstr)
return string_nwrap(NULL, 0);
return string_nwrap(cstr, strlen(cstr));
}
string_t string_fmt(const char* fmt, ...) {
va_list args;
FILE* tmp = tmpfile();
string_t self = malloc(sizeof(struct string_s));
char* data;
va_start(args, fmt);
vfprintf(tmp, fmt, args);
va_end(args);
self->size = ftell(tmp);
rewind(tmp);
data = malloc(self->size);
fread(data, self->size, sizeof(char), tmp);
fclose(tmp);
self->data = data;
return self;
}
string_t string_cat(string_t head,string_t tail) {
//There are two ways to skin a cat...
string_t self = malloc(sizeof(struct string_s));
char* data;
self->size = head->size+tail->size;
data = malloc(self->size);
memcpy(data, head->data, head->size);
memcpy(data+head->size,tail->data,tail->size);
self->data = data;
return self;
}
/* Source Dest */
void string_copy(string_t src,string_t dest) {
dest->data = realloc((char*)dest->data,src->size);
memcpy((char*)dest->data,src->data,dest->size);
}
void string_copy_raw(string_t src, void* dest,size_t size) {
size = min(src->size,size);
memcpy(dest,src->data,size);
}
const char* string_data(string_t self) {
return self->data;
}
long int string_size(string_t self) {
return self->size;
}
void string_free(string_t self) {
if(!self->_no_del)
free((char*)self->data);
free(self);
}
#ifndef min
#define min(a,b) ((a) < (b) ? (a) : (b))
#endif
bool string_equal(string_t this,string_t that) {
return !memcmp(this->data,that->data,min(this->size,that->size));
}
bool string_equali(string_t this,string_t that) {
return !strncasecmp(this->data,that->data,min(this->size,that->size));
}
int string_cmp(string_t this,string_t that) {
return memcmp(this->data,that->data,min(this->size,that->size));
}
int string_cmpi(string_t this,string_t that) {
return strncasecmp(this->data,that->data,min(this->size,that->size));
}

apps/sam/c/status (new file, 4 lines)
View File

@@ -0,0 +1,4 @@
#!/usr/bin/env perl
printf "%-8s ",uc(shift @ARGV);
print join(' ', @ARGV),"\n";

View File

@@ -3,7 +3,7 @@
<target name="bin" description="Builds assemblies from source">
<mkdir dir="bin" />
<csc target="dll" output="bin/sam-sharp.dll">
<csc target="library" output="bin/sam-sharp.dll">
<sources>
<include name="src/**/*.cs" />
</sources>

View File

@@ -65,14 +65,15 @@ public class Connection {
private Object _connectLock;
/** how many messages have been resent and not yet ACKed? */
private int _activeResends;
private ConEvent _connectionEvent;
private long _lifetimeBytesSent;
private long _lifetimeBytesReceived;
private long _lifetimeDupMessageSent;
private long _lifetimeDupMessageReceived;
public static final long MAX_RESEND_DELAY = 20*1000;
public static final long MIN_RESEND_DELAY = 10*1000;
public static final long MAX_RESEND_DELAY = 5*1000;
public static final long MIN_RESEND_DELAY = 3*1000;
/** wait up to 5 minutes after disconnection so we can ack/close packets */
public static int DISCONNECT_TIMEOUT = 5*60*1000;
@@ -116,9 +117,12 @@ public class Connection {
_connectLock = new Object();
_activeResends = 0;
_resetSentOn = -1;
_connectionEvent = new ConEvent();
_context.statManager().createRateStat("stream.con.windowSizeAtCongestion", "How large was our send window when we send a dup?", "Stream", new long[] { 60*1000, 10*60*1000, 60*60*1000 });
_context.statManager().createRateStat("stream.chokeSizeBegin", "How many messages were outstanding when we started to choke?", "Stream", new long[] { 60*1000, 10*60*1000, 60*60*1000 });
_context.statManager().createRateStat("stream.chokeSizeEnd", "How many messages were outstanding when we stopped being choked?", "Stream", new long[] { 60*1000, 10*60*1000, 60*60*1000 });
if (_log.shouldLog(Log.DEBUG))
_log.debug("New connection created with options: " + _options);
}
public long getNextOutboundPacketNum() {
@@ -152,7 +156,8 @@ public class Connection {
if (!_connected)
return false;
started = true;
if ( (_outboundPackets.size() >= _options.getWindowSize()) || (_activeResends > 0) ) {
if ( (_outboundPackets.size() >= _options.getWindowSize()) || (_activeResends > 0) ||
(_lastSendId - _highestAckedThrough > _options.getWindowSize()) ) {
if (writeExpire > 0) {
if (timeLeft <= 0) {
_log.error("Outbound window is full of " + _outboundPackets.size()
@@ -267,10 +272,13 @@ public class Connection {
SimpleTimer.getInstance().addEvent(new ResendPacketEvent(packet), timeout);
}
_context.statManager().getStatLog().addData(Packet.toId(_sendStreamId), "stream.rtt", _options.getRTT(), _options.getWindowSize());
_lastSendTime = _context.clock().now();
_outboundQueue.enqueue(packet);
resetActivityTimer();
/*
if (ackOnly) {
// ACK only, don't schedule this packet for retries
// however, if we are running low on sessionTags we want to send
@@ -281,6 +289,7 @@ public class Connection {
_connectionManager.ping(_remotePeer, _options.getRTT()*2, false, packet.getKeyUsed(), packet.getTagsSent(), new PingNotifier());
}
}
*/
}
private class PingNotifier implements ConnectionManager.PingNotifier {
@@ -802,9 +811,29 @@ public class Connection {
buf.append(" close received");
buf.append(" acked packets ").append(getAckedPackets());
buf.append(" maxWin ").append(getOptions().getMaxWindowSize());
buf.append("]");
return buf.toString();
}
public SimpleTimer.TimedEvent getConnectionEvent() { return _connectionEvent; }
/**
* fired to reschedule event notification
*/
class ConEvent implements SimpleTimer.TimedEvent {
private Exception _addedBy;
public ConEvent() {
//_addedBy = new Exception("added by");
}
public void timeReached() {
//if (_log.shouldLog(Log.DEBUG))
// _log.debug("firing event on " + _connection, _addedBy);
eventOccurred();
}
public String toString() { return "event on " + Connection.this.toString(); }
}
/**
* Coordinate the resends of a given packet
@@ -864,14 +893,15 @@ public class Connection {
newWindowSize /= 2;
if (newWindowSize <= 0)
newWindowSize = 1;
if (_log.shouldLog(Log.WARN))
_log.warn("Congestion resending packet " + _packet.getSequenceNum() + ": new windowSize " + newWindowSize
+ ") for " + Connection.this.toString());
// setRTT has its own ceiling
getOptions().setRTT(getOptions().getRTT() + 10*1000);
getOptions().setWindowSize(newWindowSize);
if (_log.shouldLog(Log.WARN))
_log.warn("Congestion resending packet " + _packet.getSequenceNum() + ": new windowSize " + newWindowSize
+ "/" + getOptions().getWindowSize() + ") for " + Connection.this.toString());
windowAdjusted();
}
}
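
On a congestion-triggered resend, the hunk above halves the send window (with a floor of one) and pushes the RTT estimate up by ten seconds, with setRTT capping the result at sixty seconds. The same arithmetic as a standalone example, illustrative only (the real code goes through ConnectionOptions):

```java
// Illustrative arithmetic only: the window halving and RTT penalty applied by
// ResendPacketEvent above when a packet is resent under congestion.
public class CongestionResendMath {
    public static void main(String[] args) {
        int windowSize = 12;
        int rttMs = 4000;

        int newWindowSize = windowSize / 2;              // halve on a congestion resend
        if (newWindowSize <= 0)
            newWindowSize = 1;
        rttMs = Math.min(rttMs + 10 * 1000, 60 * 1000);  // setRTT has its own 60s ceiling

        // prints: window 12 -> 6, rtt -> 14000
        System.out.println("window " + windowSize + " -> " + newWindowSize + ", rtt -> " + rttMs);
    }
}
```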

View File

@@ -68,6 +68,7 @@ public class ConnectionManager {
_context.statManager().createRateStat("stream.con.lifetimeRTT", "What is the final RTT when a stream closes?", "Stream", new long[] { 60*60*1000, 24*60*60*1000 });
_context.statManager().createRateStat("stream.con.lifetimeCongestionSeenAt", "When was the last congestion seen at when a stream closes?", "Stream", new long[] { 60*60*1000, 24*60*60*1000 });
_context.statManager().createRateStat("stream.con.lifetimeSendWindowSize", "What is the final send window size when a stream closes?", "Stream", new long[] { 60*60*1000, 24*60*60*1000 });
_context.statManager().createRateStat("stream.receiveActive", "How many streams are active when a new one is received (period being not yet dropped)", "Stream", new long[] { 60*60*1000, 24*60*60*1000 });
}
Connection getConnectionByInboundId(byte[] id) {
@@ -109,7 +110,14 @@ public class ConnectionManager {
byte receiveId[] = new byte[4];
_context.random().nextBytes(receiveId);
boolean reject = false;
int active = 0;
int total = 0;
synchronized (_connectionLock) {
total = _connectionByInboundId.size();
for (Iterator iter = _connectionByInboundId.values().iterator(); iter.hasNext(); ) {
if ( ((Connection)iter.next()).getIsConnected() )
active++;
}
if (locked_tooManyStreams()) {
reject = true;
} else {
@@ -121,12 +129,16 @@ public class ConnectionManager {
} else {
_connectionByInboundId.put(ba, oldCon);
// receiveId already taken, try another
// (need to realloc receiveId, as ba.getData() points to the old value)
receiveId = new byte[4];
_context.random().nextBytes(receiveId);
}
}
}
}
_context.statManager().addRateData("stream.receiveActive", active, total);
if (reject) {
if (_log.shouldLog(Log.WARN))
_log.warn("Refusing connection since we have exceeded our max of "
@@ -227,6 +239,8 @@ public class ConnectionManager {
}
private boolean locked_tooManyStreams() {
if (_maxConcurrentStreams <= 0) return false;
if (_connectionByInboundId.size() < _maxConcurrentStreams) return false;
int active = 0;
for (Iterator iter = _connectionByInboundId.values().iterator(); iter.hasNext(); ) {
Connection con = (Connection)iter.next();
@@ -238,8 +252,6 @@ public class ConnectionManager {
_log.info("More than 100 connections! " + active
+ " total: " + _connectionByInboundId.size());
if (_maxConcurrentStreams <= 0) return false;
if (_connectionByInboundId.size() < _maxConcurrentStreams) return false;
return (active >= _maxConcurrentStreams);
}
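
The "(need to realloc receiveId, as ba.getData() points to the old value)" comment above is about aliasing: the ByteArray kept in the connection map still wraps the original receiveId array, so refilling that same array for the next attempt would silently rewrite the stored key. A small standalone illustration of the pitfall, using a hypothetical wrapper in place of net.i2p.data.ByteArray:

```java
import java.util.Arrays;

// Illustrates the aliasing the hunk above guards against: a wrapper that keeps a
// reference to the caller's byte[] sees any later in-place changes to that array.
public class ReceiveIdAliasing {
    static final class Wrapper {                 // hypothetical stand-in for ByteArray
        private final byte[] data;
        Wrapper(byte[] data) { this.data = data; }
        byte[] getData() { return data; }
    }

    public static void main(String[] args) {
        byte[] receiveId = {1, 2, 3, 4};
        Wrapper storedKey = new Wrapper(receiveId);                 // left behind in the map
        receiveId[0] = 9;                                           // reuse the array in place...
        System.out.println(Arrays.toString(storedKey.getData()));  // [9, 2, 3, 4] - stored key changed
        receiveId = new byte[4];                                    // ...hence the fresh allocation
        Wrapper retryKey = new Wrapper(receiveId);                  // no longer aliases the stored key
        System.out.println(Arrays.toString(retryKey.getData()));   // [0, 0, 0, 0]
    }
}
```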

View File

@@ -13,6 +13,7 @@ public class ConnectionOptions extends I2PSocketOptionsImpl {
private int _receiveWindow;
private int _profile;
private int _rtt;
private int _trend[];
private int _resendDelay;
private int _sendAckDelay;
private int _maxMessageSize;
@@ -50,6 +51,8 @@ public class ConnectionOptions extends I2PSocketOptionsImpl {
public static final String PROP_CONGESTION_AVOIDANCE_GROWTH_RATE_FACTOR = "i2p.streaming.congestionAvoidanceGrowthRateFactor";
public static final String PROP_SLOW_START_GROWTH_RATE_FACTOR = "i2p.streaming.slowStartGrowthRateFactor";
private static final int TREND_COUNT = 3;
public ConnectionOptions() {
super();
}
@@ -85,6 +88,8 @@ public class ConnectionOptions extends I2PSocketOptionsImpl {
protected void init(Properties opts) {
super.init(opts);
_trend = new int[TREND_COUNT];
setConnectDelay(getInt(opts, PROP_CONNECT_DELAY, -1));
setProfile(getInt(opts, PROP_PROFILE, PROFILE_BULK));
setMaxMessageSize(getInt(opts, PROP_MAX_MESSAGE_SIZE, 4*1024));
@@ -93,13 +98,13 @@ public class ConnectionOptions extends I2PSocketOptionsImpl {
setResendDelay(getInt(opts, PROP_INITIAL_RESEND_DELAY, 1000));
setSendAckDelay(getInt(opts, PROP_INITIAL_ACK_DELAY, 500));
setWindowSize(getInt(opts, PROP_INITIAL_WINDOW_SIZE, 1));
setMaxResends(getInt(opts, PROP_MAX_RESENDS, 5));
setMaxResends(getInt(opts, PROP_MAX_RESENDS, 10));
setWriteTimeout(getInt(opts, PROP_WRITE_TIMEOUT, -1));
setInactivityTimeout(getInt(opts, PROP_INACTIVITY_TIMEOUT, 5*60*1000));
setInactivityAction(getInt(opts, PROP_INACTIVITY_ACTION, INACTIVITY_ACTION_DISCONNECT));
setInboundBufferSize(getMaxMessageSize() * (Connection.MAX_WINDOW_SIZE + 2));
setCongestionAvoidanceGrowthRateFactor(getInt(opts, PROP_CONGESTION_AVOIDANCE_GROWTH_RATE_FACTOR, 2));
setSlowStartGrowthRateFactor(getInt(opts, PROP_SLOW_START_GROWTH_RATE_FACTOR, 2));
setCongestionAvoidanceGrowthRateFactor(getInt(opts, PROP_CONGESTION_AVOIDANCE_GROWTH_RATE_FACTOR, 1));
setSlowStartGrowthRateFactor(getInt(opts, PROP_SLOW_START_GROWTH_RATE_FACTOR, 1));
setConnectTimeout(getInt(opts, PROP_CONNECT_TIMEOUT, Connection.DISCONNECT_TIMEOUT));
setMaxWindowSize(getInt(opts, PROP_MAX_WINDOW_SIZE, Connection.MAX_WINDOW_SIZE));
@@ -125,7 +130,7 @@ public class ConnectionOptions extends I2PSocketOptionsImpl {
if (opts.containsKey(PROP_INITIAL_WINDOW_SIZE))
setWindowSize(getInt(opts, PROP_INITIAL_WINDOW_SIZE, 1));
if (opts.containsKey(PROP_MAX_RESENDS))
setMaxResends(getInt(opts, PROP_MAX_RESENDS, 5));
setMaxResends(getInt(opts, PROP_MAX_RESENDS, 10));
if (opts.containsKey(PROP_WRITE_TIMEOUT))
setWriteTimeout(getInt(opts, PROP_WRITE_TIMEOUT, -1));
if (opts.containsKey(PROP_INACTIVITY_TIMEOUT))
@@ -186,11 +191,36 @@ public class ConnectionOptions extends I2PSocketOptionsImpl {
*/
public int getRTT() { return _rtt; }
public void setRTT(int ms) {
synchronized (_trend) {
_trend[0] = _trend[1];
_trend[1] = _trend[2];
if (ms > _rtt)
_trend[2] = 1;
else if (ms < _rtt)
_trend[2] = -1;
else
_trend[2] = 0;
}
_rtt = ms;
if (_rtt > 60*1000)
_rtt = 60*1000;
}
/**
* If we have 3 consecutive rtt increases, we are trending upwards (1), or if we have
* 3 consecutive rtt decreases, we are trending downwards (-1), else we're stable.
*
*/
public int getRTTTrend() {
synchronized (_trend) {
for (int i = 0; i < TREND_COUNT - 1; i++) {
if (_trend[i] != _trend[i+1])
return 0;
}
return _trend[0];
}
}
/** rtt = rtt*RTT_DAMPENING + (1-RTT_DAMPENING)*currentPacketRTT */
private static final double RTT_DAMPENING = 0.9;
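
setRTT() above records only the direction of each of the last three samples, and getRTTTrend() reports +1 or -1 only when all three moved the same way; anything mixed reads as stable (0). A standalone re-implementation of just that bookkeeping (not the real ConnectionOptions class), with a short trace:

```java
// Illustrative re-implementation of the three-sample RTT trend kept by
// ConnectionOptions.setRTT()/getRTTTrend() in the hunk above.
public class RttTrendSketch {
    private final int[] trend = new int[3];
    private int rtt;

    void setRTT(int ms) {
        trend[0] = trend[1];
        trend[1] = trend[2];
        trend[2] = (ms > rtt) ? 1 : (ms < rtt) ? -1 : 0;
        rtt = Math.min(ms, 60 * 1000);            // same 60 second ceiling as the hunk
    }

    int getRTTTrend() {
        if (trend[0] == trend[1] && trend[1] == trend[2])
            return trend[0];                      // three moves in the same direction
        return 0;                                 // otherwise: stable
    }

    public static void main(String[] args) {
        RttTrendSketch opts = new RttTrendSketch();
        for (int ms : new int[] {1000, 1200, 1500, 1800})
            opts.setRTT(ms);                      // three consecutive increases...
        System.out.println(opts.getRTTTrend());   // ...so this prints 1
    }
}
```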

View File

@@ -26,6 +26,7 @@ public class ConnectionPacketHandler {
_context.statManager().createRateStat("stream.con.packetsAckedPerMessageReceived", "Size of a duplicate message received on a connection", "Stream", new long[] { 60*1000, 10*60*1000, 60*60*1000 });
_context.statManager().createRateStat("stream.sendsBeforeAck", "How many times a message was sent before it was ACKed?", "Stream", new long[] { 10*60*1000, 60*60*1000 });
_context.statManager().createRateStat("stream.resetReceived", "How many messages had we sent successfully before receiving a RESET?", "Stream", new long[] { 60*60*1000, 24*60*60*1000 });
_context.statManager().createRateStat("stream.trend", "What direction the RTT is trending in (with period = windowsize)", "Stream", new long[] { 60*1000, 60*60*1000 });
}
/** distribute a packet to the connection specified */
@@ -33,7 +34,7 @@ public class ConnectionPacketHandler {
boolean ok = verifyPacket(packet, con);
if (!ok) {
if ( (!packet.isFlagSet(Packet.FLAG_RESET)) && (_log.shouldLog(Log.ERROR)) )
_log.error("Packet does NOT verify: " + packet);
_log.error("Packet does NOT verify: " + packet + " on " + con);
packet.releasePayload();
return;
}
@@ -61,7 +62,7 @@ public class ConnectionPacketHandler {
con.getOutputStream().setBufferSize(packet.getOptionalMaxSize());
}
}
con.packetReceived();
boolean choke = false;
@@ -91,7 +92,20 @@ public class ConnectionPacketHandler {
_context.statManager().addRateData("stream.con.receiveMessageSize", packet.getPayloadSize(), 0);
boolean isNew = con.getInputStream().messageReceived(packet.getSequenceNum(), packet.getPayload());
boolean isNew = false;
boolean allowAck = true;
if ( (!packet.isFlagSet(Packet.FLAG_SYNCHRONIZE)) &&
( (packet.getSendStreamId() == null) ||
(packet.getReceiveStreamId() == null) ||
(DataHelper.eq(packet.getSendStreamId(), Packet.STREAM_ID_UNKNOWN)) ||
(DataHelper.eq(packet.getReceiveStreamId(), Packet.STREAM_ID_UNKNOWN)) ) )
allowAck = false;
if (allowAck)
isNew = con.getInputStream().messageReceived(packet.getSequenceNum(), packet.getPayload());
else
isNew = con.getInputStream().messageReceived(con.getInputStream().getHighestReadyBockId(), null);
if ( (packet.getSequenceNum() == 0) && (packet.getPayloadSize() > 0) ) {
if (_log.shouldLog(Log.DEBUG))
@@ -167,14 +181,31 @@ public class ConnectionPacketHandler {
// non-ack message payloads are queued in the MessageInputStream
packet.releasePayload();
}
//if (choke)
// con.fastRetransmit();
}
private boolean ack(Connection con, long ackThrough, long nacks[], Packet packet, boolean isNew, boolean choke) {
if ( (nacks != null) && (nacks.length > 0) )
con.getOptions().setRTT(con.getOptions().getRTT() + nacks.length*1000);
//if ( (nacks != null) && (nacks.length > 0) )
// con.getOptions().setRTT(con.getOptions().getRTT() + nacks.length*1000);
int numResends = 0;
List acked = con.ackPackets(ackThrough, nacks);
List acked = null;
// if we don't know the streamIds for both sides of the connection, there's no way we
// could actually be acking data (this fixes the buggered up ack of packet 0 problem).
// this is called after packet verification, which places the stream IDs as necessary if
// the SYN verifies (so if we're acking w/out stream IDs, no SYN has been received yet)
if ( (packet != null) && (packet.getSendStreamId() != null) && (packet.getReceiveStreamId() != null) &&
(con != null) && (con.getSendStreamId() != null) && (con.getReceiveStreamId() != null) &&
(!DataHelper.eq(packet.getSendStreamId(), Packet.STREAM_ID_UNKNOWN)) &&
(!DataHelper.eq(packet.getReceiveStreamId(), Packet.STREAM_ID_UNKNOWN)) &&
(!DataHelper.eq(con.getSendStreamId(), Packet.STREAM_ID_UNKNOWN)) &&
(!DataHelper.eq(con.getReceiveStreamId(), Packet.STREAM_ID_UNKNOWN)) )
acked = con.ackPackets(ackThrough, nacks);
else
return false;
if ( (acked != null) && (acked.size() > 0) ) {
if (_log.shouldLog(Log.DEBUG))
_log.debug(acked.size() + " of our packets acked with " + packet);
@@ -224,15 +255,17 @@ public class ConnectionPacketHandler {
oldSize >>>= 1;
if (oldSize <= 0)
oldSize = 1;
if (_log.shouldLog(Log.DEBUG))
_log.debug("Congestion occurred - new windowSize " + oldSize + " congestionSeenAt: "
+ con.getLastCongestionSeenAt() + " (#resends: " + numResends
+ ") for " + con);
// setRTT has its own ceiling
con.getOptions().setRTT(con.getOptions().getRTT() + 10*1000);
con.getOptions().setWindowSize(oldSize);
if (_log.shouldLog(Log.DEBUG))
_log.debug("Congestion occurred - new windowSize " + oldSize + " / " + con.getOptions().getWindowSize() + " congestionSeenAt: "
+ con.getLastCongestionSeenAt() + " (#resends: " + numResends
+ ") for " + con);
congested = true;
}
@@ -242,8 +275,13 @@ public class ConnectionPacketHandler {
int oldWindow = con.getOptions().getWindowSize();
int newWindowSize = oldWindow;
int trend = con.getOptions().getRTTTrend();
_context.statManager().addRateData("stream.trend", trend, newWindowSize);
if ( (!congested) && (acked > 0) && (numResends <= 0) ) {
if (newWindowSize > con.getLastCongestionSeenAt() / 2) {
if ( (newWindowSize > con.getLastCongestionSeenAt() / 2) ||
(trend > 0) ) { // tcp vegas: avoidance if rtt is increasing, even if we arent at ssthresh/2 yet
// congestion avoidance
// we can't use newWindowSize += 1/newWindowSize, since we're
@@ -253,7 +291,7 @@ public class ConnectionPacketHandler {
newWindowSize += 1;
} else {
// slow start, but modified to take into account the fact
// that windows in the streaming lib are messages, not bytes,
// that windows in the streaming lib are messages, not bytes,
// so we only grow 1 every N times (where N = the slow start factor)
int shouldIncrement = _context.random().nextInt(con.getOptions().getSlowStartGrowthRateFactor());
if (shouldIncrement <= 0)
@@ -263,13 +301,14 @@ public class ConnectionPacketHandler {
if (newWindowSize <= 0)
newWindowSize = 1;
if (_log.shouldLog(Log.DEBUG))
_log.debug("New window size " + newWindowSize + "/" + oldWindow + " congestionSeenAt: "
+ con.getLastCongestionSeenAt() + " (#resends: " + numResends
+ ") for " + con);
con.getOptions().setWindowSize(newWindowSize);
con.setCongestionWindowEnd(newWindowSize + lowest);
if (_log.shouldLog(Log.DEBUG))
_log.debug("New window size " + newWindowSize + "/" + oldWindow + "/" + con.getOptions().getWindowSize() + " congestionSeenAt: "
+ con.getLastCongestionSeenAt() + " (#resends: " + numResends
+ ") for " + con);
}
con.windowAdjusted();
@@ -299,16 +338,16 @@ public class ConnectionPacketHandler {
if (packet.getSequenceNum() <= 2) {
return true;
} else {
if (_log.shouldLog(Log.WARN))
_log.warn("Packet without RST or SYN where we dont know stream ID: "
if (_log.shouldLog(Log.ERROR))
_log.error("Packet without RST or SYN where we dont know stream ID: "
+ packet);
return false;
}
}
} else {
if (!DataHelper.eq(con.getSendStreamId(), packet.getReceiveStreamId())) {
if (_log.shouldLog(Log.WARN))
_log.warn("Packet received with the wrong reply stream id: "
if (_log.shouldLog(Log.ERROR))
_log.error("Packet received with the wrong reply stream id: "
+ con + " / " + packet);
return false;
} else {
@@ -325,8 +364,8 @@ public class ConnectionPacketHandler {
if (DataHelper.eq(con.getReceiveStreamId(), packet.getSendStreamId())) {
boolean ok = packet.verifySignature(_context, packet.getOptionalFrom(), null);
if (!ok) {
if (_log.shouldLog(Log.WARN))
_log.warn("Received unsigned / forged RST on " + con);
if (_log.shouldLog(Log.ERROR))
_log.error("Received unsigned / forged RST on " + con);
return;
} else {
if (_log.shouldLog(Log.DEBUG))
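
The hunk above picks between two growth modes once an ack arrives with no congestion and no resends: linear (congestion-avoidance) growth when the window has passed half of the last congestion point or the RTT trend is positive (the TCP-Vegas-style signal), otherwise slow start that grows by one message every N acks, N being the slow-start growth factor. A compressed sketch of that decision (illustrative; the congestion-avoidance counter elided from the hunk is not reproduced):

```java
import java.util.Random;

// Sketch of the window-growth decision in ConnectionPacketHandler.ack() above.
// Values are made up; with the new default growth factors of 1, both branches
// end up adding one message per call.
public class WindowGrowthSketch {
    public static void main(String[] args) {
        Random rng = new Random();
        int windowSize = 6;
        int lastCongestionSeenAt = 8;
        int rttTrend = 1;                       // > 0 means RTT is rising
        int slowStartGrowthRateFactor = 1;      // new default from ConnectionOptions above

        if (windowSize > lastCongestionSeenAt / 2 || rttTrend > 0) {
            windowSize += 1;                    // congestion avoidance: linear growth
        } else {
            // slow start, but windows count messages, not bytes,
            // so only grow 1 every N times (N = the slow start factor)
            if (rng.nextInt(slowStartGrowthRateFactor) <= 0)
                windowSize += 1;
        }
        System.out.println("window -> " + windowSize);   // 7 in this example
    }
}
```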

View File

@@ -91,8 +91,8 @@ public class MessageHandler implements I2PSessionListener {
*
*/
public void errorOccurred(I2PSession session, String message, Throwable error) {
if (_log.shouldLog(Log.ERROR))
_log.error("error occurred: " + message + "- " + error.getMessage());
if (_log.shouldLog(Log.WARN))
_log.warn("error occurred: " + message + "- " + error.getMessage());
if (_log.shouldLog(Log.WARN))
_log.warn("cause", error);
//_manager.disconnectAllHard();

View File

@@ -202,7 +202,7 @@ public class MessageInputStream extends InputStream {
public boolean messageReceived(long messageId, ByteArray payload) {
synchronized (_dataLock) {
if (_log.shouldLog(Log.DEBUG))
_log.debug("received " + messageId + " with " + payload.getValid());
_log.debug("received " + messageId + " with " + (payload != null ? payload.getValid()+"" : "no payload"));
if (messageId <= _highestReadyBlockId) {
if (_log.shouldLog(Log.DEBUG))
_log.debug("ignoring dup message " + messageId);

View File

@@ -38,6 +38,10 @@ public class MessageOutputStream extends OutputStream {
* size
*/
private volatile int _nextBufferSize;
// rate calc helpers
private long _sendPeriodBeginTime;
private long _sendPeriodBytes;
private int _sendBps;
public MessageOutputStream(I2PAppContext ctx, DataReceiver receiver) {
this(ctx, receiver, Packet.MAX_PAYLOAD_SIZE);
@@ -55,6 +59,10 @@ public class MessageOutputStream extends OutputStream {
_writeTimeout = -1;
_passiveFlushDelay = 500;
_nextBufferSize = -1;
_sendPeriodBeginTime = ctx.clock().now();
_sendPeriodBytes = 0;
_sendBps = 0;
_context.statManager().createRateStat("stream.sendBps", "How fast we pump data through the stream", "Stream", new long[] { 60*1000, 5*60*1000, 60*60*1000 });
_flusher = new Flusher();
if (_log.shouldLog(Log.DEBUG))
_log.debug("MessageOutputStream created");
@@ -137,6 +145,21 @@ public class MessageOutputStream extends OutputStream {
if ( (elapsed > 10*1000) && (_log.shouldLog(Log.DEBUG)) )
_log.debug("wtf, took " + elapsed + "ms to write to the stream?", new Exception("foo"));
throwAnyError();
updateBps(len);
}
private void updateBps(int len) {
long now = _context.clock().now();
int periods = (int)Math.floor((now - _sendPeriodBeginTime) / 1000d);
if (periods > 0) {
// first term decays on slow transmission
_sendBps = (int)(((float)0.9f*((float)_sendBps/(float)periods)) + ((float)0.1f*((float)_sendPeriodBytes/(float)periods)));
_sendPeriodBytes = len;
_sendPeriodBeginTime = now;
_context.statManager().addRateData("stream.sendBps", _sendBps, 0);
} else {
_sendPeriodBytes += len;
}
}
public void write(int b) throws IOException {
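
updateBps() above divides both the previous estimate and the bytes written this period by the number of whole seconds elapsed, so the estimate decays whenever the stream goes quiet for a while. Worked numbers, assuming a prior estimate of 4000 Bps, 6000 bytes written, and three elapsed seconds:

```java
// Worked example of the decayed send-rate update in updateBps() above.
public class SendBpsExample {
    public static void main(String[] args) {
        int sendBps = 4000;        // previous estimate
        long periodBytes = 6000;   // bytes written since the period began
        int periods = 3;           // three whole seconds elapsed

        sendBps = (int) (0.9f * ((float) sendBps / periods)
                       + 0.1f * ((float) periodBytes / periods));
        System.out.println(sendBps);   // 0.9 * 1333.3 + 0.1 * 2000 = 1400
    }
}
```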

View File

@@ -578,7 +578,7 @@ public class Packet {
return buf;
}
private static final String toId(byte id[]) {
static final String toId(byte id[]) {
if (id == null)
return Base64.encode(STREAM_ID_UNKNOWN);
else

View File

@@ -119,6 +119,19 @@ public class PacketHandler {
}
private void receiveKnownCon(Connection con, Packet packet) {
if (packet.isFlagSet(Packet.FLAG_ECHO)) {
if (packet.getSendStreamId() != null) {
receivePing(packet);
} else if (packet.getReceiveStreamId() != null) {
receivePong(packet);
} else {
if (_log.shouldLog(Log.WARN))
_log.warn("Echo packet received with no stream IDs: " + packet);
}
packet.releasePayload();
return;
}
// the packet is pointed at a stream ID we're receiving on
if (isValidMatch(con.getSendStreamId(), packet.getReceiveStreamId())) {
// the packet's receive stream ID also matches what we expect
@@ -163,8 +176,19 @@ public class PacketHandler {
} else {
if (!con.getResetSent()) {
// someone is sending us a packet on the wrong stream
if (_log.shouldLog(Log.WARN))
_log.warn("Received a packet on the wrong stream: " + packet + " connection: " + con);
if (_log.shouldLog(Log.ERROR)) {
Set cons = _manager.listConnections();
StringBuffer buf = new StringBuffer(512);
buf.append("Received a packet on the wrong stream: ");
buf.append(packet);
buf.append(" connection: ");
buf.append(con);
for (Iterator iter = cons.iterator(); iter.hasNext();) {
Connection cur = (Connection)iter.next();
buf.append(" ").append(cur);
}
_log.error(buf.toString(), new Exception("Wrong stream"));
}
}
packet.releasePayload();
}

View File

@@ -17,21 +17,6 @@ abstract class SchedulerImpl implements TaskScheduler {
}
protected void reschedule(long msToWait, Connection con) {
SimpleTimer.getInstance().addEvent(new ConEvent(con), msToWait);
}
private class ConEvent implements SimpleTimer.TimedEvent {
private Connection _connection;
private Exception _addedBy;
public ConEvent(Connection con) {
_connection = con;
//_addedBy = new Exception("added by");
}
public void timeReached() {
//if (_log.shouldLog(Log.DEBUG))
// _log.debug("firing event on " + _connection, _addedBy);
_connection.eventOccurred();
}
public String toString() { return "event on " + _connection; }
SimpleTimer.getInstance().addEvent(con.getConnectionEvent(), msToWait);
}
}

View File

@@ -1,239 +1,239 @@
/*
* Created on Nov 9, 2004
*
* This file is part of susimail project, see http://susi.i2p/
*
* Copyright (C) 2004-2005 <susi23@mail.i2p>
*
* This program is free software; you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation; either version 2 of the License, or
* (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software
* Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
*
* $Revision: 1.1 $
*/
package i2p.susi.webmail;
import i2p.susi.util.Config;
import i2p.susi.util.ReadBuffer;
import i2p.susi.webmail.encoding.Encoding;
import i2p.susi.webmail.encoding.EncodingFactory;
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.InputStreamReader;
import java.text.DateFormat;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.Iterator;
import java.util.Locale;
/**
* data structure to hold a single message, mostly used with folder view and sorting
*
* @author susi
*/
public class Mail {
public static final String DATEFORMAT = "date.format";
public static final String unknown = "unknown";
public int id, size;
public String sender, reply, subject, dateString,
formattedSender, formattedSubject, formattedDate,
shortSender, shortSubject, quotedDate, uidl;
public Date date;
public ReadBuffer header, body;
public MailPart part;
Object[] to, cc;
public String error;
public boolean markForDeletion;
public boolean deleted;
public Mail() {
id = 0;
size = 0;
formattedSender = unknown;
formattedSubject = unknown;
formattedDate = unknown;
shortSender = unknown;
shortSubject = unknown;
quotedDate = unknown;
error = "";
}
/**
*
* @param address
* @return
*/
public static boolean validateAddress( String address )
{
if( address == null || address.length() == 0 )
return false;
address = address.trim();
if( address.indexOf( "\n" ) != -1 ||
address.indexOf( "\r" ) != -1 )
return false;
String[] tokens = address.split( "[ \t]+" );
int addresses = 0;
for( int i = 0; i < tokens.length; i++ ) {
if( tokens[i].matches( "^[^@< \t]+@[^> \t]+$" ) ||
tokens[i].matches( "^<[^@< \t]+@[^> \t]+>$" ) )
addresses++;
}
return addresses == 1;
}
/**
* @param address
* @return
*/
public static String getAddress(String address )
{
String[] tokens = address.split( "[ \t]+" );
for( int i = 0; i < tokens.length; i++ ) {
if( tokens[i].matches( "^[^@< \t]+@[^> \t]+$" ) )
return "<" + tokens[i] + ">";
if( tokens[i].matches( "^<[^@< \t]+@[^> \t]+>$" ) )
return tokens[i];
}
return null;
}
public static boolean getRecipientsFromList( ArrayList recipients, String text, boolean ok )
{
if( text != null && text.length() > 0 ) {
String[] ccs = text.split( "," );
for( int i = 0; i < ccs.length; i++ ) {
String recipient = ccs[i].trim();
if( validateAddress( recipient ) ) {
String str = getAddress( recipient );
if( str != null && str.length() > 0 ) {
recipients.add( str );
}
else {
ok = false;
}
}
else {
ok = false;
}
}
}
return ok;
}
public static void appendRecipients( StringBuffer buf, ArrayList recipients, String prefix )
{
for( Iterator it = recipients.iterator(); it.hasNext(); ) {
buf.append( prefix );
prefix ="\t";
buf.append( (String)it.next() );
buf.append( "\r\n" );
}
}
public void parseHeaders()
{
DateFormat dateFormatter = new SimpleDateFormat( Config.getProperty( DATEFORMAT, "MM/dd/yyyy HH:mm:ss" ) ); // MM = month; lowercase mm would be minutes
DateFormat mailDateFormatter = new SimpleDateFormat("EEE, d MMM yyyy HH:mm:ss Z", Locale.ENGLISH );
error = "";
if( header != null ) {
boolean ok = true;
Encoding html = EncodingFactory.getEncoding( "HTML" );
if( html == null ) {
error += "HTML encoder not found.<br>";
ok = false;
}
Encoding hl = EncodingFactory.getEncoding( "HEADERLINE" );
if( hl == null ) {
error += "Header line encoder not found.<br>";
ok = false;
}
if( ok ) {
try {
ReadBuffer decoded = hl.decode( header );
BufferedReader reader = new BufferedReader( new InputStreamReader( new ByteArrayInputStream( decoded.content, decoded.offset, decoded.length ), "ISO-8859-1" ) );
String line;
while( ( line = reader.readLine() ) != null ) {
if( line.length() == 0 )
break;
if( line.startsWith( "From:" ) ) {
sender = line.substring( 5 ).trim();
formattedSender = getAddress( sender );
shortSender = formattedSender.trim();
if( shortSender.length() > 40 ) {
shortSender = shortSender.substring( 0, 37 ).trim() + "...";
}
shortSender = html.encode( shortSender );
}
else if( line.startsWith( "Date:" ) ) {
dateString = line.substring( 5 ).trim();
try {
date = mailDateFormatter.parse( dateString );
formattedDate = dateFormatter.format( date );
quotedDate = html.encode( dateString );
}
catch (ParseException e) {
date = null;
e.printStackTrace();
}
}
else if( line.startsWith( "Subject:" ) ) {
subject = line.substring( 8 ).trim();
formattedSubject = subject;
shortSubject = formattedSubject;
if( formattedSubject.length() > 60 )
shortSubject = formattedSubject.substring( 0, 57 ).trim() + "...";
shortSubject = html.encode( shortSubject );
}
else if( line.toLowerCase().startsWith( "reply-to:" ) ) {
reply = Mail.getAddress( line.substring( 9 ).trim() );
}
else if( line.startsWith( "To:" ) ) {
ArrayList list = new ArrayList();
Mail.getRecipientsFromList( list, line.substring( 3 ).trim(), true );
to = list.toArray();
}
else if( line.startsWith( "Cc:" ) ) {
ArrayList list = new ArrayList();
Mail.getRecipientsFromList( list, line.substring( 3 ).trim(), true );
cc = list.toArray();
}
}
}
catch( Exception e ) {
error += "Error parsing mail header: " + e.getClass().getName() + "<br>";
}
}
}
}
}
/*
* Created on Nov 9, 2004
*
* This file is part of susimail project, see http://susi.i2p/
*
* Copyright (C) 2004-2005 <susi23@mail.i2p>
*
* This program is free software; you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation; either version 2 of the License, or
* (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software
* Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
*
* $Revision: 1.2 $
*/
package i2p.susi.webmail;
import i2p.susi.util.Config;
import i2p.susi.util.ReadBuffer;
import i2p.susi.webmail.encoding.Encoding;
import i2p.susi.webmail.encoding.EncodingFactory;
import java.io.BufferedReader;
import java.io.ByteArrayInputStream;
import java.io.InputStreamReader;
import java.text.DateFormat;
import java.text.ParseException;
import java.text.SimpleDateFormat;
import java.util.ArrayList;
import java.util.Date;
import java.util.Iterator;
import java.util.Locale;
/**
* data structure to hold a single message, mostly used with folder view and sorting
*
* @author susi
*/
public class Mail {
public static final String DATEFORMAT = "date.format";
public static final String unknown = "unknown";
public int id, size;
public String sender, reply, subject, dateString,
formattedSender, formattedSubject, formattedDate,
shortSender, shortSubject, quotedDate, uidl;
public Date date;
public ReadBuffer header, body;
public MailPart part;
Object[] to, cc;
public String error;
public boolean markForDeletion;
public boolean deleted;
public Mail() {
id = 0;
size = 0;
formattedSender = unknown;
formattedSubject = unknown;
formattedDate = unknown;
shortSender = unknown;
shortSubject = unknown;
quotedDate = unknown;
error = "";
}
/**
*
* @param address
* @return
*/
public static boolean validateAddress( String address )
{
if( address == null || address.length() == 0 )
return false;
address = address.trim();
if( address.indexOf( "\n" ) != -1 ||
address.indexOf( "\r" ) != -1 )
return false;
String[] tokens = address.split( "[ \t]+" );
int addresses = 0;
for( int i = 0; i < tokens.length; i++ ) {
if( tokens[i].matches( "^[^@< \t]+@[^> \t]+$" ) ||
tokens[i].matches( "^<[^@< \t]+@[^> \t]+>$" ) )
addresses++;
}
return addresses == 1;
}
/**
* @param address
* @return
*/
public static String getAddress(String address )
{
String[] tokens = address.split( "[ \t]+" );
for( int i = 0; i < tokens.length; i++ ) {
if( tokens[i].matches( "^[^@< \t]+@[^> \t]+$" ) )
return "<" + tokens[i] + ">";
if( tokens[i].matches( "^<[^@< \t]+@[^> \t]+>$" ) )
return tokens[i];
}
return null;
}
public static boolean getRecipientsFromList( ArrayList recipients, String text, boolean ok )
{
if( text != null && text.length() > 0 ) {
String[] ccs = text.split( "," );
for( int i = 0; i < ccs.length; i++ ) {
String recipient = ccs[i].trim();
if( validateAddress( recipient ) ) {
String str = getAddress( recipient );
if( str != null && str.length() > 0 ) {
recipients.add( str );
}
else {
ok = false;
}
}
else {
ok = false;
}
}
}
return ok;
}
public static void appendRecipients( StringBuffer buf, ArrayList recipients, String prefix )
{
for( Iterator it = recipients.iterator(); it.hasNext(); ) {
buf.append( prefix );
prefix ="\t";
buf.append( (String)it.next() );
buf.append( "\r\n" );
}
}
public void parseHeaders()
{
DateFormat dateFormatter = new SimpleDateFormat( Config.getProperty( DATEFORMAT, "MM/dd/yyyy HH:mm:ss" ) ); // MM = month; lowercase mm would be minutes
DateFormat mailDateFormatter = new SimpleDateFormat("EEE, d MMM yyyy HH:mm:ss Z", Locale.ENGLISH );
error = "";
if( header != null ) {
boolean ok = true;
Encoding html = EncodingFactory.getEncoding( "HTML" );
if( html == null ) {
error += "HTML encoder not found.<br>";
ok = false;
}
Encoding hl = EncodingFactory.getEncoding( "HEADERLINE" );
if( hl == null ) {
error += "Header line encoder not found.<br>";
ok = false;
}
if( ok ) {
try {
ReadBuffer decoded = hl.decode( header );
BufferedReader reader = new BufferedReader( new InputStreamReader( new ByteArrayInputStream( decoded.content, decoded.offset, decoded.length ), "ISO-8859-1" ) );
String line;
while( ( line = reader.readLine() ) != null ) {
if( line.length() == 0 )
break;
if( line.startsWith( "From:" ) ) {
sender = line.substring( 5 ).trim();
formattedSender = getAddress( sender );
shortSender = formattedSender.trim();
if( shortSender.length() > 40 ) {
shortSender = shortSender.substring( 0, 37 ).trim() + "...";
}
shortSender = html.encode( shortSender );
}
else if( line.startsWith( "Date:" ) ) {
dateString = line.substring( 5 ).trim();
try {
date = mailDateFormatter.parse( dateString );
formattedDate = dateFormatter.format( date );
quotedDate = html.encode( dateString );
}
catch (ParseException e) {
date = null;
e.printStackTrace();
}
}
else if( line.startsWith( "Subject:" ) ) {
subject = line.substring( 8 ).trim();
formattedSubject = subject;
shortSubject = formattedSubject;
if( formattedSubject.length() > 60 )
shortSubject = formattedSubject.substring( 0, 57 ).trim() + "...";
shortSubject = html.encode( shortSubject );
}
else if( line.toLowerCase().startsWith( "reply-to:" ) ) {
reply = Mail.getAddress( line.substring( 9 ).trim() );
}
else if( line.startsWith( "To:" ) ) {
ArrayList list = new ArrayList();
Mail.getRecipientsFromList( list, line.substring( 3 ).trim(), true );
to = list.toArray();
}
else if( line.startsWith( "Cc:" ) ) {
ArrayList list = new ArrayList();
Mail.getRecipientsFromList( list, line.substring( 3 ).trim(), true );
cc = list.toArray();
}
}
}
catch( Exception e ) {
error += "Error parsing mail header: " + e.getClass().getName() + "<br>";
}
}
}
}
}


@@ -1,102 +1,102 @@
/*
* Created on Nov 23, 2004
*
* This file is part of susimail project, see http://susi.i2p/
*
* Copyright (C) 2004-2005 <susi23@mail.i2p>
*
* This program is free software; you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation; either version 2 of the License, or
* (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software
* Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
*
* $Revision: 1.1 $
*/
package i2p.susi.webmail;
import java.util.Hashtable;
import i2p.susi.webmail.pop3.POP3MailBox;
/**
* @author user
*/
public class MailCache {
public static final boolean FETCH_HEADER = true;
public static final boolean FETCH_ALL = false;
private POP3MailBox mailbox;
private String error;
private Hashtable mails;
private Object synchronizer;
MailCache( POP3MailBox mailbox ) {
this.mailbox = mailbox;
mails = new Hashtable();
synchronizer = new Object();
}
/**
* Fetch any needed data from pop3 server.
*
* @param uidl UIDL of the message to fetch
* @param headerOnly fetch only the header lines?
* @return the mail, or null if it is marked as deleted or cannot be fetched
*/
public Mail getMail( String uidl, boolean headerOnly ) {
Mail mail = null, newMail = null;
if( mailbox != null ) {
/*
* synchronize update to hashtable
*/
synchronized( synchronizer ) {
mail = (Mail)mails.get( uidl );
if( mail == null ) {
newMail = new Mail();
mails.put( uidl, newMail );
}
}
if( mail == null ) {
mail = newMail;
mail.uidl = uidl;
mail.size = mailbox.getSize( uidl );
}
if( mail.size < 1024 )
headerOnly = false;
boolean parseHeaders = mail.header == null;
if( headerOnly ) {
if( mail.header == null )
mail.header = mailbox.getHeader( uidl );
}
else {
if( mail.body == null ) {
mail.body = mailbox.getBody( uidl );
if( mail.body != null ) {
mail.header = mail.body;
MailPart.parse( mail );
}
}
}
if( parseHeaders && mail.header != null )
mail.parseHeaders();
}
if( mail != null && mail.deleted )
mail = null;
return mail;
}
}
/*
* Created on Nov 23, 2004
*
* This file is part of susimail project, see http://susi.i2p/
*
* Copyright (C) 2004-2005 <susi23@mail.i2p>
*
* This program is free software; you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation; either version 2 of the License, or
* (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program; if not, write to the Free Software
* Foundation, Inc., 59 Temple Place, Suite 330, Boston, MA 02111-1307 USA
*
* $Revision: 1.2 $
*/
package i2p.susi.webmail;
import java.util.Hashtable;
import i2p.susi.webmail.pop3.POP3MailBox;
/**
* @author user
*/
public class MailCache {
public static final boolean FETCH_HEADER = true;
public static final boolean FETCH_ALL = false;
private POP3MailBox mailbox;
private String error;
private Hashtable mails;
private Object synchronizer;
MailCache( POP3MailBox mailbox ) {
this.mailbox = mailbox;
mails = new Hashtable();
synchronizer = new Object();
}
/**
* Fetch any needed data from pop3 server.
*
* @param uidl UIDL of the message to fetch
* @param headerOnly fetch only the header lines?
* @return the mail, or null if it is marked as deleted or cannot be fetched
*/
public Mail getMail( String uidl, boolean headerOnly ) {
Mail mail = null, newMail = null;
if( mailbox != null ) {
/*
* synchronize update to hashtable
*/
synchronized( synchronizer ) {
mail = (Mail)mails.get( uidl );
if( mail == null ) {
newMail = new Mail();
mails.put( uidl, newMail );
}
}
if( mail == null ) {
mail = newMail;
mail.uidl = uidl;
mail.size = mailbox.getSize( uidl );
}
if( mail.size < 1024 )
headerOnly = false;
boolean parseHeaders = mail.header == null;
if( headerOnly ) {
if( mail.header == null )
mail.header = mailbox.getHeader( uidl );
}
else {
if( mail.body == null ) {
mail.body = mailbox.getBody( uidl );
if( mail.body != null ) {
mail.header = mail.body;
MailPart.parse( mail );
}
}
}
if( parseHeaders && mail.header != null )
mail.parseHeaders();
}
if( mail != null && mail.deleted )
mail = null;
return mail;
}
}

apps/syndie/doc/intro.sml Normal file

@@ -0,0 +1,31 @@
Syndie is a new effort to build a user friendly secure blogging tool, exploiting the capabilities offered by anonymity and security systems such as [link schema="web" location="http://www.i2p.net/"]I2P[/link], [link schema="web" location="http://tor.eff.org/"]TOR[/link], [link schema="web" location="http://www.freenetproject.org/"]Freenet[/link], [link schema="web" location="http://www.mnetproject.org/"]MNet[/link], and others. Abstracting away the content distribution side, Syndie allows people to [b]build content and communities[/b] that span technologies rather than tying oneself down to the ups and downs of any particular network.
[cut][/cut]Syndie is working to take the technologies of the security, anonymity, and cryptography worlds and merge them with the simplicity and user focus of the blogging world. From the user's standpoint, you could perhaps view Syndie as a distributed [link schema="web" location="http://www.livejournal.com"]LiveJournal[/link], while technically Syndie is much, much simpler.
[b]How Syndie works[/b][hr][/hr]The [i]magic[/i] behind Syndie's abstraction is to ignore any content distribution issues and merely assume data moves around as necessary. Each Syndie instance runs against the filesystem, verifying and indexing blogs and offering up what it knows to the user through a web interface. The core idea in Syndie, therefore, is the [b]archive[/b] - a collection of blogs categorized and ready for consumption.
Whenever someone reads or posts to a Syndie instance, it is working with the [b]local archive[/b]. However, as Syndie's development progresses, people will be able to read [b]remote archives[/b] - pulling the archive summary from an I2P [i]eepsite[/i], TOR [i]hosted service[/i], Freenet [i]Freesite[/i], MNet [i]key[/i], or (with a little less glamor) usenet, filesharing apps, or the web. The first thing Syndie needs to use a remote archive is the archive's index - a plain text file summarizing what the archive contains ([attachment id="0"]an example[/attachment]). From that, Syndie will let the user browse through the blogs, pulling the individual blog posts into the local archive when necessary.
[b]Posting[/b][hr][/hr]Creating and posting to blogs with Syndie is trivial - simply log in to Syndie, click on the [i]Post[/i] button, and fill out the form offered. Syndie handles all of the encryption and formatting details - packaging up the post with any attached files into a single signed, compressed, and potentially encrypted bundle, storing it in the local archive, ready to be shared with other Syndie users. Every blog is identified by its public key behind the scenes, so there is no need for a central authority to require that your blogs are all named uniquely or any other such thing.
While each blog is run by a single author, that author can in turn allow other authors to post to the blog while still letting readers know that the post is authorized (though created by a different author). Of course, if multiple people wanted to run a single blog and make it look like only one person wrote it, they could share the blog's private keys.
[b]Tags[/b][hr][/hr]Following the lessons from the last few years, every Syndie entry has any number of tags associated with it by the author, allowing trivial categorization and filtering.
[b]Hosting[/b][hr][/hr]While in many scenarios it is best for people to run Syndie locally on their machine, Syndie is a fully multiuser system so anyone can be a Syndie hosting provider by simply exposing the web interface to the public. The Syndie host's operator can password-protect the blog registration interface so only authorized people can create a blog, and the operator can technically go through and delete blog posts or even entire blogs from their local archive. A public Syndie host can be a general purpose blog repository, letting anyone sign up (following the blogger and geocities path), be a more community oriented blog repository, requiring people to introduce you to the host to sign up (following the livejournal/orkut path), be a more focused blog repository, requiring posts to stay within certain guidelines (following the indymedia path), or even fit specialized needs by picking and choosing among the best blogs and posts out there, offering the operator's editorial flair in a comprehensive collection.
[b]Syndication[/b][hr][/hr]By itself, Syndie is a nice blogging community system, but its real strength as a tool for individual and community empowerment comes when blogs are shared. While Syndie does not aim to be a content distribution network, it does want to exploit them so that those who need to get their message out can do so. By design, syndicating Syndie can be done with some of the most basic tools - simply pass around the self-authenticating files written to the archive and you're done. The archive itself is organized so that you can expose it as an indexed directory in some webserver and let people wget against it, choosing to pull individual posts, all posts within a blog, all posts since a given date, or all posts in all blogs. With a very small shell script, you could parse the plain text archive summary to pull posts by size and tag as well. People could offer up their archives as rsync repositories or package up tarballs/zipfiles of blogs or entries - simply grabbing them and extracting them into your local Syndie archive would instantly give you access to all of the content therein.
Of course, manual syndication as described above has... limits. When appropriate, Syndie will tie in to content syndication systems such as [link schema="eep" location="http://feedspace.i2p/"]Feedspace[/link] (or even good ol' Usenet) to automatically import (and export) posts. Integration with content distribution networks like Freenet and MNet will allow the user to periodically grab a published archive index and pull down blogs as necessary. Posting archives and blogs to those networks will be done trivially as well, though they do still depend upon a polling paradigm.
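As a concrete illustration of that manual flow, the Archive class further down this page already exposes everything an importer needs. The following is a minimal, hypothetical sketch - the class name, the ./syndie/archive and ./syndie/cache paths, and the command-line handling are invented for illustration and are not part of the Syndie source - of pulling a fetched meta.snm and .snd pair into a local archive:

import java.io.FileInputStream;
import java.io.IOException;
import net.i2p.I2PAppContext;
import net.i2p.syndie.Archive;
import net.i2p.syndie.data.BlogInfo;
import net.i2p.syndie.data.EntryContainer;

public class ImportFetchedEntry {
    public static void main(String args[]) throws IOException {
        // hypothetical helper: args[0] = meta.snm pulled from a remote archive,
        // args[1] = the matching .snd entry file
        I2PAppContext ctx = I2PAppContext.getGlobalContext();
        Archive archive = new Archive(ctx, "./syndie/archive", "./syndie/cache");
        BlogInfo info = new BlogInfo();
        info.load(new FileInputStream(args[0]));      // blog metadata; verified when stored
        if (!archive.storeBlogInfo(info))
            return;                                   // signature did not check out
        EntryContainer entry = new EntryContainer();
        entry.load(new FileInputStream(args[1]));     // the signed, compressed post bundle
        archive.storeEntry(entry);                    // verifies against the stored BlogInfo
        archive.regenerateIndex();                    // refresh archive.txt for downstream syndicators
    }
}

Since both storeBlogInfo and storeEntry refuse anything that does not verify, a syndicator can safely ingest files from completely untrusted peers.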
[b]SML[/b][hr][/hr]Syndie is meant to work securely with any browser regardless of the browser's security. Blog entries are written in [b]SML[/b] [i](Syndie or Secure Markup Language)[/i] with a bbcode-like syntax, extended to exploit some of Syndie's capabilities and context. In addition to the SML content in a blog entry, there can be any number of attachments, references to other blogs/posts/tags, nym<->public key mappings (useful for I2P host distribution), references to archives of blogs (on eepsites, freesites, etc), links to various resources, and more.
[b]Future[/b][hr][/hr]Down the road, there are lots of things to improve with Syndie. The interface, of course, is critical, as are tools for SML authoring and improvements to SML itself to offer a more engaging user experience. Integration with a search engine like Lucene would allow full text search through entire archives, and Atom/RSS interfaces would allow trivial import and export to existing clients. Even further, blogs could be transparently encrypted, allowing only authorized users (those with the key) to read entries posted to them (or even know what attachments are included). Integration with existing blogging services (such as [link schema="web" location="http://www.anonyblog.com"]anonyblog[/link], [link schema="web" location="http://blo.gs"]blo.gs[/link], and [link schema="web" location="http://livejournal.com"]livejournal[/link]) may also be explored. Of course, bundling with I2P and other anonymity, security, and community systems will be pursued.
[b]Who/where/when/why[/b][hr][/hr]The base Syndie system was written in a few days by [blog name="jrandom" bloghash="ovpBy2mpO1CQ7deYhQ1cDGAwI6pQzLbWOm1Sdd0W06c=" archive0="eep://dev.i2p/~jrandom" archive1="http://dev.i2p.net/~jrandom" archive2="mailto://jrandom@i2p.net"][/blog], though it comes out of discussions with [link schema="eep" location="http://frosk.i2p"]Frosk[/link] and many others in the I2P community. Yes, this is an incarnation of [b]MyI2P[/b] (or for those who remember jrand0m's flog, [b]Flogger[/b]).
All of the Syndie code is of course open source and released into the public domain (the [i]real[/i] "free as in freedom"), though it does use some BSD licensed cryptographic routines and an Apache licensed file upload component. Contributions of code are very much welcome - the source is located within the [link schema="web" location="http://www.i2p.net/cvs"]I2P codebase[/link]. Of course, those who cannot or choose not to contribute code are encouraged to [b]use[/b] Syndie - create a blog, create some content, read some content! For those who really want to though, financial contributions to the Syndie development effort can be channeled through the [link schema="web" location="http://www.i2p.net/donate"]I2P fund[/link] (donations for Syndie are distributed to Syndie developers from time to time).
The "why" of Syndie is a much bigger question, though is hopefully self-evident. We need kickass anonymity-aware client applications so that we can get better anonymity (since without kickass clients, we don't have many users). We also need kickass tools for safe blogging, since there are limits to the strength offered by low latency anonymity systems like I2P and TOR - Syndie goes beyond them to offer an interface to mid and high latency anonymous systems while exploiting their capabilities for fast and efficient syndication.
Oh, and jrandom also lost his blog's private key, so needed something to blog with again.


@@ -0,0 +1,27 @@
To install this base instance:
mkdir lib
cp ../lib/i2p.jar lib/
cp ../lib/commons-el.jar lib/
cp ../lib/commons-logging.jar lib/
cp ../lib/jasper-compiler.jar lib/
cp ../lib/jasper-runtime.jar lib/
cp ../lib/javax.servlet.jar lib/
cp ../lib/jbigi.jar lib/
cp ../lib/org.mortbay.jetty.jar lib/
cp ../lib/xercesImpl.jar lib/
To run it:
sh run.sh
firefox http://localhost:7653/syndie/
You can share your archive at http://localhost:7653/ so
that people can syndicate off you via
cd archive ; wget -m -nH http://yourmachine:7653/
You may want to add a password on the registration form
so that you have control over who can create blogs via /syndie/.
To do so, set the password in the run.sh script.
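Judging from BlogManager.register() further down this page, the syndie.registrationPassword property holds the Base64 of the SHA hash of the password rather than the password itself. A hypothetical one-off (the class name, usage, and the -D flag form are assumptions made here for illustration) for producing that value:

import net.i2p.I2PAppContext;
import net.i2p.data.Base64;
import net.i2p.data.DataHelper;

public class HashRegistrationPassword {
    public static void main(String args[]) {
        // BlogManager.register() compares the property against Base64(SHA(password typed
        // into the form)), so the configured value should be the hash, not the password
        String hashed = Base64.encode(I2PAppContext.getGlobalContext().sha()
                .calculateHash(DataHelper.getUTF8(args[0])).getData());
        System.out.println("-Dsyndie.registrationPassword=" + hashed);
    }
}

run.sh can then pass that system property (or it could go into the syndie.config file, which BlogManager loads into the system properties at startup).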
Windows users:
write your own instructions. We're alpha, here ;)

apps/syndie/doc/sml.sml Normal file

@@ -0,0 +1,41 @@
[cut]A brief glance at SML[/cut]
[b]General rules[/b]
Newlines are newlines are newlines. If you include a newline in your SML, you'll get a newline in the rendered HTML.
All < and > characters are replaced by their HTML entity counterparts.
All SML tags are enclosed with [[ and ]] (e.g. [[b]]bold stuff[[/b]]). ([[ and ]] characters are quoted by [[[[ and ]]]], respectively)
Nesting SML tags is [b]not[/b] currently supported (though will be at a later date).
All SML tags must have a beginning and end tag (even for ones without any 'body', such as [[hr]][[/hr]]). This restriction may be removed later.
Simple formatting tags behave as expected: [[b]], [[i]], [[u]], [[h1]] through [[h5]], [[hr]], [[pre]].
[hr][/hr]
[b]Tag details[/b]
* To cut an entry so that the summary is before while the details are afterwards:
[[cut]]more inside...[[/cut]]
* To load an attachment as an image with "syndie's logo" as the alternate text:
[[img attachment="0"]]syndie's logo[[/img]]
* To add a download link to an attachment:
[[attachment id="0"]]anchor text[[/img]]
* To quote someone:
[[quote author="who you are quoting" location="blog://ovpBy2mpO1CQ7deYhQ1cDGAwI6pQzLbWOm1Sdd0W06c=/1234567890"]]stuff they said[[/quote]]
* To sample some code:
[[code location="eep://dev.i2p/cgi-bin/cvsweb.cgi/i2p/index.html"]]<html>[[/code]]
* To link to a [blog name="jrandom" bloghash="ovpBy2mpO1CQ7deYhQ1cDGAwI6pQzLbWOm1Sdd0W06c=" blogentry="1124402137773" archive0="eep://dev.i2p/~jrandom/archive" archive1="irc2p://jrandom@irc.postman.i2p/#i2p"]bitchin' blog[/blog]:
[[blog name="the blogs name" bloghash="ovpBy2mpO1CQ7deYhQ1cDGAwI6pQzLbWOm1Sdd0W06c=" blogtag="tag" blogentry="123456789" archive0="eep://dev.i2p/~jrandom/archive/" archive1="freenet://SSK@blah/archive//"]]description of the blog[[/blog]]. blogentry and blogtag are optional and there can be any number of archiveN locations specified.
* To link to an [link schema="eep" location="http://dev.i2p/"]external resource[/link]:
[[link schema="eep" location="http://dev.i2p/"]]link to it[[/link]].
[i]The schema should be a network selection tool, such as "eep" for an eepsite, "tor" for a tor hidden service, "web" for a normal website, "freenet" for a freenet key, etc. The local user's Syndie configuration should include information necessary for the user to access the content referenced through the given schemas.[/i]
* To pass an [address name="dev.i2p" schema="eep" location="NF2RLVUxVulR3IqK0sGJR0dHQcGXAzwa6rEO4WAWYXOHw-DoZhKnlbf1nzHXwMEJoex5nFTyiNMqxJMWlY54cvU~UenZdkyQQeUSBZXyuSweflUXFqKN-y8xIoK2w9Ylq1k8IcrAFDsITyOzjUKoOPfVq34rKNDo7fYyis4kT5bAHy~2N1EVMs34pi2RFabATIOBk38Qhab57Umpa6yEoE~rbyR~suDRvD7gjBvBiIKFqhFueXsR2uSrPB-yzwAGofTXuklofK3DdKspciclTVzqbDjsk5UXfu2nTrC1agkhLyqlOfjhyqC~t1IXm-Vs2o7911k7KKLGjB4lmH508YJ7G9fLAUyjuB-wwwhejoWqvg7oWvqo4oIok8LG6ECR71C3dzCvIjY2QcrhoaazA9G4zcGMm6NKND-H4XY6tUWhpB~5GefB3YczOqMbHq4wi0O9MzBFrOJEOs3X4hwboKWANf7DT5PZKJZ5KorQPsYRSq0E3wSOsFCSsdVCKUGsAAAA"]addressbook entry[/address]:
[[address name="dev.i2p" schema="eep" location="NF2...AAAA"]]add it[[/address]].

apps/syndie/java/build.xml Normal file

@@ -0,0 +1,101 @@
<?xml version="1.0" encoding="UTF-8"?>
<project basedir="." default="all" name="syndie">
<target name="all" depends="clean, build" />
<target name="build" depends="builddep, jar" />
<target name="builddep">
<ant dir="../../jetty/" target="build" />
<ant dir="../../../core/java/" target="build" />
<!-- ministreaming will build core -->
</target>
<target name="compile">
<mkdir dir="./build" />
<mkdir dir="./build/obj" />
<javac
srcdir="./src"
debug="true" deprecation="on" source="1.3" target="1.3"
destdir="./build/obj"
classpath="../../../core/java/build/i2p.jar:../../jetty/jettylib/org.mortbay.jetty.jar:../../jetty/jettylib/javax.servlet.jar" />
</target>
<target name="jar" depends="builddep, compile">
<jar destfile="./build/syndie.jar" basedir="./build/obj" includes="**/*.class">
<manifest>
<attribute name="Main-Class" value="net.i2p.syndie.CLI" />
<attribute name="Class-Path" value="i2p.jar" />
</manifest>
</jar>
<ant target="war" />
</target>
<target name="war" depends="builddep, compile, precompilejsp">
<war destfile="../syndie.war" webxml="../jsp/web-out.xml">
<fileset dir="../jsp/" includes="**/*" excludes=".nbintdb, web.xml, web-out.xml, web-fragment.xml, **/*.java, **/*.jsp" />
<classes dir="./build/obj" />
</war>
</target>
<target name="precompilejsp">
<delete dir="../jsp/WEB-INF/" />
<delete file="../jsp/web-fragment.xml" />
<delete file="../jsp/web-out.xml" />
<mkdir dir="../jsp/WEB-INF/" />
<mkdir dir="../jsp/WEB-INF/classes" />
<!-- there are various jspc ant tasks, but they all seem a bit flakey -->
<java classname="org.apache.jasper.JspC" fork="true" >
<classpath>
<pathelement location="../../jetty/jettylib/jasper-compiler.jar" />
<pathelement location="../../jetty/jettylib/jasper-runtime.jar" />
<pathelement location="../../jetty/jettylib/javax.servlet.jar" />
<pathelement location="../../jetty/jettylib/commons-logging.jar" />
<pathelement location="../../jetty/jettylib/commons-el.jar" />
<pathelement location="../../jetty/jettylib/org.mortbay.jetty.jar" />
<pathelement location="../../jetty/jettylib/javax.servlet.jar" />
<pathelement location="../../jetty/jettylib/ant.jar" />
<pathelement location="build/obj" />
<pathelement location="../../../core/java/build/i2p.jar" />
</classpath>
<arg value="-d" />
<arg value="../jsp/WEB-INF/classes" />
<arg value="-p" />
<arg value="net.i2p.syndie.jsp" />
<arg value="-webinc" />
<arg value="../jsp/web-fragment.xml" />
<arg value="-webapp" />
<arg value="../jsp/" />
</java>
<javac debug="true" deprecation="on" source="1.3" target="1.3"
destdir="../jsp/WEB-INF/classes/" srcdir="../jsp/WEB-INF/classes" includes="**/*.java" >
<classpath>
<pathelement location="../../jetty/jettylib/jasper-runtime.jar" />
<pathelement location="../../jetty/jettylib/javax.servlet.jar" />
<pathelement location="../../jetty/jettylib/commons-logging.jar" />
<pathelement location="../../jetty/jettylib/commons-el.jar" />
<pathelement location="../../jetty/jettylib/org.mortbay.jetty.jar" />
<pathelement location="../../jetty/jettylib/javax.servlet.jar" />
<pathelement location="build/obj" />
<pathelement location="../../../core/java/build/i2p.jar" />
</classpath>
</javac>
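<!-- web-out.xml is web.xml with the precompiled servlet mappings from web-fragment.xml spliced in at its placeholder comment -->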
<copy file="../jsp/web.xml" tofile="../jsp/web-out.xml" />
<loadfile property="jspc.web.fragment" srcfile="../jsp/web-fragment.xml" />
<replace file="../jsp/web-out.xml">
<replacefilter token="&lt;!-- precompiled servlets --&gt;" value="${jspc.web.fragment}" />
</replace>
</target>
<target name="javadoc">
<mkdir dir="./build" />
<mkdir dir="./build/javadoc" />
<javadoc
sourcepath="./src:../../../core/java/src" destdir="./build/javadoc"
packagenames="*"
use="true"
splitindex="true"
windowtitle="syndie" />
</target>
<target name="clean">
<delete dir="./build" />
</target>
<target name="cleandep" depends="clean">
<ant dir="../../../core/java/" target="distclean" />
</target>
<target name="distclean" depends="clean">
<ant dir="../../../core/java/" target="distclean" />
</target>
</project>


@@ -0,0 +1,418 @@
package net.i2p.syndie;
import java.io.*;
import java.util.*;
import java.text.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
/**
* Store blog info in the local filesystem.
*
* Entries are stored under:
* $rootDir/$h(blogKey)/$entryId.snd (the index lists them as YYYYMMDD_n_jKB)
* Blog info is stored under:
* $rootDir/$h(blogKey)/meta.snm
* Archive summary is stored under
* $rootDir/archive.txt
* Any key=value pairs in
* $rootDir/archiveHeaders.txt
* are injected into the archive.txt on regeneration.
*
* When entries are loaded for extraction/verification/etc, their contents are written to
* $cacheDir/$h(blogKey)/$entryId/ (e.g. $cacheDir/$h(blogKey)/$entryId/entry.sml)
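*
* For example (hypothetical values, assuming the GMT default timezone set by BlogManager):
* the third entry posted on 2005-08-18 gets entryId 1124323200002 (the day's start in ms, plus 2),
* is stored as $rootDir/$h(blogKey)/1124323200002.snd, and is listed in archive.txt as
* 20050818_2_12KB if its signed container is ~12KB.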
*/
public class Archive {
private I2PAppContext _context;
private File _rootDir;
private File _cacheDir;
private Map _blogInfo;
private ArchiveIndex _index;
private EntryExtractor _extractor;
private String _defaultSelector;
public static final String METADATA_FILE = "meta.snm";
public static final String INDEX_FILE = "archive.txt";
public static final String HEADER_FILE = "archiveHeaders.txt";
private static final FilenameFilter _entryFilenameFilter = new FilenameFilter() {
public boolean accept(File dir, String name) { return name.endsWith(".snd"); }
};
public Archive(I2PAppContext ctx, String rootDir, String cacheDir) {
_context = ctx;
_rootDir = new File(rootDir);
if (!_rootDir.exists())
_rootDir.mkdirs();
_cacheDir = new File(cacheDir);
if (!_cacheDir.exists())
_cacheDir.mkdirs();
_blogInfo = new HashMap();
_index = null;
_extractor = new EntryExtractor(ctx);
_defaultSelector = ctx.getProperty("syndie.defaultSelector");
if (_defaultSelector == null) _defaultSelector = "";
reloadInfo();
}
public void reloadInfo() {
File f[] = _rootDir.listFiles();
List info = new ArrayList();
for (int i = 0; i < f.length; i++) {
if (f[i].isDirectory()) {
File meta = new File(f[i], METADATA_FILE);
if (meta.exists()) {
BlogInfo bi = new BlogInfo();
try {
bi.load(new FileInputStream(meta));
if (bi.verify(_context)) {
info.add(bi);
} else {
System.err.println("Invalid blog (but we're storing it anyway): " + bi);
new Exception("foo").printStackTrace();
info.add(bi);
}
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
}
}
synchronized (_blogInfo) {
_blogInfo.clear();
for (int i = 0; i < info.size(); i++) {
BlogInfo bi = (BlogInfo)info.get(i);
_blogInfo.put(bi.getKey().calculateHash(), bi);
}
}
}
public String getDefaultSelector() { return _defaultSelector; }
public BlogInfo getBlogInfo(BlogURI uri) {
if (uri == null) return null;
synchronized (_blogInfo) {
return (BlogInfo)_blogInfo.get(uri.getKeyHash());
}
}
public BlogInfo getBlogInfo(Hash key) {
synchronized (_blogInfo) {
return (BlogInfo)_blogInfo.get(key);
}
}
public boolean storeBlogInfo(BlogInfo info) {
if (!info.verify(_context)) {
System.err.println("Not storing the invalid blog " + info);
new Exception("foo!").printStackTrace();
return false;
}
boolean isNew = true;
synchronized (_blogInfo) {
BlogInfo old = (BlogInfo)_blogInfo.get(info.getKey().calculateHash());
if ( (old == null) || (old.getEdition() < info.getEdition()) )
_blogInfo.put(info.getKey().calculateHash(), info);
else
isNew = false;
}
if (!isNew) return true; // valid entry, but not stored, since it's not newer than what we already have
try {
File blogDir = new File(_rootDir, info.getKey().calculateHash().toBase64());
blogDir.mkdirs();
File blogFile = new File(blogDir, "meta.snm");
FileOutputStream out = new FileOutputStream(blogFile);
info.write(out);
out.close();
System.out.println("Blog info written to " + blogFile.getPath());
return true;
} catch (IOException ioe) {
ioe.printStackTrace();
return false;
}
}
public List listBlogs() {
synchronized (_blogInfo) {
return new ArrayList(_blogInfo.values());
}
}
private File getEntryDir(File entryFile) {
String name = entryFile.getName();
if (!name.endsWith(".snd")) throw new RuntimeException("hmm, why are we trying to get an entry dir for " + entryFile.getAbsolutePath());
String blog = entryFile.getParentFile().getName();
File blogDir = new File(_cacheDir, blog);
return new File(blogDir, name.substring(0, name.length()-4));
//return new File(entryFile.getParentFile(), "." + name.substring(0, name.length()-4));
}
/**
* Expensive operation, reading all entries within the blog and parsing out the tags.
* Whenever possible, query the index instead of the archive
*
*/
public List listTags(Hash blogKeyHash) {
List rv = new ArrayList();
BlogInfo info = getBlogInfo(blogKeyHash);
if (info == null)
return rv;
File blogDir = new File(_rootDir, Base64.encode(blogKeyHash.getData()));
File entries[] = blogDir.listFiles(_entryFilenameFilter);
for (int j = 0; j < entries.length; j++) {
try {
File entryDir = getEntryDir(entries[j]);
EntryContainer entry = null;
if (entryDir.exists())
entry = getCachedEntry(entryDir);
if ( (entry == null) || (!entryDir.exists()) ) {
if (!extractEntry(entries[j], entryDir, info)) {
System.err.println("Entry " + entries[j].getPath() + " is not valid");
new Exception("foo!!").printStackTrace();
continue;
}
entry = getCachedEntry(entryDir);
}
String tags[] = entry.getTags();
for (int t = 0; t < tags.length; t++) {
if (!rv.contains(tags[t])) {
System.out.println("Found a new tag in cached " + entry.getURI() + ": " + tags[t]);
rv.add(tags[t]);
}
}
} catch (IOException ioe) {
ioe.printStackTrace();
}
} // end iterating over the entries
return rv;
}
/**
* Extract the entry to the given dir, returning true if it was verified properly
*
*/
private boolean extractEntry(File entryFile, File entryDir, BlogInfo info) throws IOException {
if (!entryDir.exists())
entryDir.mkdirs();
boolean ok = _extractor.extract(entryFile, entryDir, null, info);
if (!ok) {
File files[] = entryDir.listFiles();
for (int i = 0; i < files.length; i++)
files[i].delete();
entryDir.delete();
}
return ok;
}
private EntryContainer getCachedEntry(File entryDir) {
try {
return new CachedEntry(entryDir);
} catch (IOException ioe) {
ioe.printStackTrace();
File files[] = entryDir.listFiles();
for (int i = 0; i < files.length; i++)
files[i].delete();
entryDir.delete();
return null;
}
}
public EntryContainer getEntry(BlogURI uri) { return getEntry(uri, null); }
public EntryContainer getEntry(BlogURI uri, SessionKey blogKey) {
List entries = listEntries(uri, null, blogKey);
if (entries.size() > 0)
return (EntryContainer)entries.get(0);
else
return null;
}
public List listEntries(BlogURI uri, String tag, SessionKey blogKey) {
return listEntries(uri.getKeyHash(), uri.getEntryId(), tag, blogKey);
}
public List listEntries(Hash blog, long entryId, String tag, SessionKey blogKey) {
List rv = new ArrayList();
BlogInfo info = getBlogInfo(blog);
if (info == null)
return rv;
File blogDir = new File(_rootDir, blog.toBase64());
File entries[] = blogDir.listFiles(_entryFilenameFilter);
if (entries == null)
return rv;
for (int i = 0; i < entries.length; i++) {
try {
EntryContainer entry = null;
if (blogKey == null) {
// no explicit blog key - use (and populate) the extracted entry cache
File entryDir = getEntryDir(entries[i]);
if (entryDir.exists())
entry = getCachedEntry(entryDir);
if ((entry == null) || !entryDir.exists()) {
if (!extractEntry(entries[i], entryDir, info)) {
System.err.println("Entry " + entries[i].getPath() + " is not valid");
new Exception("foo!!!!").printStackTrace();
continue;
}
entry = getCachedEntry(entryDir);
}
} else {
// we have an explicit key - no caching
entry = new EntryContainer();
entry.load(new FileInputStream(entries[i]));
boolean ok = entry.verifySignature(_context, info);
if (!ok) {
System.err.println("Keyed entry " + entries[i].getPath() + " is not valid");
new Exception("foo!!!!!!").printStackTrace();
continue;
}
entry.parseRawData(_context, blogKey);
entry.setCompleteSize((int)entries[i].length());
}
if (entryId >= 0) {
if (entry.getURI().getEntryId() == entryId) {
rv.add(entry);
return rv;
}
} else if (tag != null) {
String tags[] = entry.getTags();
for (int j = 0; j < tags.length; j++) {
if (tags[j].equals(tag)) {
rv.add(entry);
System.out.println("cached entry matched requested tag [" + tag + "]: " + entry.getURI());
break;
}
}
} else {
System.out.println("cached entry is ok and no id or tag was requested: " + entry.getURI());
rv.add(entry);
}
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
return rv;
}
public boolean storeEntry(EntryContainer container) {
if (container == null) return false;
BlogURI uri = container.getURI();
if (uri == null) return false;
File blogDir = new File(_rootDir, uri.getKeyHash().toBase64());
blogDir.mkdirs();
File entryFile = new File(blogDir, getEntryFilename(uri.getEntryId()));
if (entryFile.exists()) return true;
BlogInfo info = getBlogInfo(uri);
if (info == null) {
System.out.println("no blog metadata for the uri " + uri);
return false;
}
if (!container.verifySignature(_context, info)) {
System.out.println("Not storing the invalid blog entry at " + uri);
return false;
} else {
//System.out.println("Signature is valid: " + container.getSignature() + " for info " + info);
}
try {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
container.write(baos, true);
byte data[] = baos.toByteArray();
FileOutputStream out = new FileOutputStream(entryFile);
out.write(data);
out.close();
container.setCompleteSize(data.length);
return true;
} catch (IOException ioe) {
ioe.printStackTrace();
return false;
}
}
public static String getEntryFilename(long entryId) { return entryId + ".snd"; }
private static SimpleDateFormat _dateFmt = new SimpleDateFormat("yyyyMMdd", Locale.UK);
public static String getIndexName(long entryId, int numBytes) {
try {
synchronized (_dateFmt) {
String yy = _dateFmt.format(new Date(entryId));
long begin = _dateFmt.parse(yy).getTime();
long n = entryId - begin;
int kb = numBytes / 1024;
return yy + '_' + n + '_' + kb + "KB";
}
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
return "UNKNOWN";
} catch (ParseException pe) {
pe.printStackTrace();
return "UNKNOWN";
}
}
public static long getEntryIdFromIndexName(String entryIndexName) {
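// inverse of getIndexName(): the YYYYMMDD prefix gives the day's start and n the ms offset into it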
if (entryIndexName == null) return -1;
if (entryIndexName.endsWith(".snd"))
entryIndexName = entryIndexName.substring(0, entryIndexName.length() - 4);
int endYY = entryIndexName.indexOf('_');
if (endYY <= 0) return -1;
int endN = entryIndexName.indexOf('_', endYY+1);
if (endN <= 0) return -1;
String yy = entryIndexName.substring(0, endYY);
String n = entryIndexName.substring(endYY+1, endN);
try {
synchronized (_dateFmt) {
long dayBegin = _dateFmt.parse(yy).getTime();
long dayEntry = Long.parseLong(n);
return dayBegin + dayEntry;
}
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
} catch (ParseException pe) {
pe.printStackTrace();
}
return -1;
}
public static int getSizeFromIndexName(String entryIndexName) {
if (entryIndexName == null) return -1;
if (entryIndexName.endsWith(".snd"))
entryIndexName = entryIndexName.substring(0, entryIndexName.length() - 4);
int beginSize = entryIndexName.lastIndexOf('_');
if ( (beginSize <= 0) || (beginSize >= entryIndexName.length()-3) )
return -1;
try {
String sz = entryIndexName.substring(beginSize+1, entryIndexName.length()-2);
return Integer.parseInt(sz);
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
}
return -1;
}
public ArchiveIndex getIndex() {
if (_index == null)
regenerateIndex();
return _index;
}
public File getArchiveDir() { return _rootDir; }
public File getIndexFile() { return new File(_rootDir, INDEX_FILE); }
public void regenerateIndex() {
reloadInfo();
_index = ArchiveIndexer.index(_context, this);
try {
FileOutputStream out = new FileOutputStream(new File(_rootDir, INDEX_FILE));
out.write(DataHelper.getUTF8(_index.toString()));
out.flush();
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
}


@@ -0,0 +1,190 @@
package net.i2p.syndie;
import java.io.*;
import java.text.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
import net.i2p.syndie.sml.*;
/**
* Dig through the archive to build an index
*/
class ArchiveIndexer {
private static final int RECENT_BLOG_COUNT = 10;
private static final int RECENT_ENTRY_COUNT = 10;
public static ArchiveIndex index(I2PAppContext ctx, Archive source) {
LocalArchiveIndex rv = new LocalArchiveIndex();
rv.setGeneratedOn(ctx.clock().now());
File rootDir = source.getArchiveDir();
File headerFile = new File(rootDir, Archive.HEADER_FILE);
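// archiveHeaders.txt is read as one key:value pair per line (anything that doesn't split
// into exactly two tokens on ':' is ignored) and injected into the generated index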
if (headerFile.exists()) {
try {
BufferedReader in = new BufferedReader(new InputStreamReader(new FileInputStream(headerFile), "UTF-8"));
String line = null;
while ( (line = in.readLine()) != null) {
StringTokenizer tok = new StringTokenizer(line, ":");
if (tok.countTokens() == 2)
rv.setHeader(tok.nextToken(), tok.nextToken());
}
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
// things are new if we just received them in the last day
long newSince = ctx.clock().now() - 24*60*60*1000;
rv.setVersion(Version.INDEX_VERSION);
/** 0-lowestEntryId --> blog Hash */
Map blogsByAge = new TreeMap();
/** 0-entryId --> BlogURI */
Map entriesByAge = new TreeMap();
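// ids are negated so the TreeMaps' ascending iteration order runs newest-first, which the
// RECENT_BLOG_COUNT / RECENT_ENTRY_COUNT loops at the bottom rely on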
List blogs = source.listBlogs();
rv.setAllBlogs(blogs.size());
int newEntries = 0;
int allEntries = 0;
long newSize = 0;
long totalSize = 0;
int newBlogs = 0;
SMLParser parser = new SMLParser();
for (int i = 0; i < blogs.size(); i++) {
BlogInfo cur = (BlogInfo)blogs.get(i);
Hash key = cur.getKey().calculateHash();
String keyStr = Base64.encode(key.getData());
File blogDir = new File(rootDir, Base64.encode(key.getData()));
File metaFile = new File(blogDir, Archive.METADATA_FILE);
long metadate = metaFile.lastModified();
List entries = source.listEntries(key, -1, null, null);
System.out.println("Entries under " + key + ": " + entries);
/** tag name --> ordered map of entryId to EntryContainer */
Map tags = new TreeMap();
for (int j = 0; j < entries.size(); j++) {
EntryContainer entry = (EntryContainer)entries.get(j);
entriesByAge.put(new Long(0-entry.getURI().getEntryId()), entry.getURI());
allEntries++;
totalSize += entry.getCompleteSize();
String entryTags[] = entry.getTags();
for (int t = 0; t < entryTags.length; t++) {
if (!tags.containsKey(entryTags[t])) {
tags.put(entryTags[t], new TreeMap());
//System.err.println("New tag [" + entryTags[t] + "]");
}
Map entriesByTag = (Map)tags.get(entryTags[t]);
entriesByTag.put(new Long(0-entry.getURI().getEntryId()), entry);
System.out.println("Entries under tag " + entryTags[t] + ":" + entriesByTag.values());
}
if (entry.getURI().getEntryId() >= newSince) {
newEntries++;
newSize += entry.getCompleteSize();
}
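// parse the entry's SML just to pull out its headers: if it names a parent post in its
// in-reply-to header, thread it under that parent in the index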
HeaderReceiver rec = new HeaderReceiver();
parser.parse(entry.getEntry().getText(), rec);
String reply = rec.getHeader(HTMLRenderer.HEADER_IN_REPLY_TO);
if (reply != null) {
BlogURI parent = new BlogURI(reply.trim());
if ( (parent.getKeyHash() != null) && (parent.getEntryId() >= 0) )
rv.addReply(parent, entry.getURI());
else
System.err.println("Parent of " + entry.getURI() + " is not valid: [" + reply.trim() + "]");
}
}
long lowestEntryId = -1;
for (Iterator iter = tags.keySet().iterator(); iter.hasNext(); ) {
String tagName = (String)iter.next();
Map tagEntries = (Map)tags.get(tagName);
long highestId = -1;
if (tagEntries.size() <= 0) break;
Long id = (Long)tagEntries.keySet().iterator().next();
highestId = 0 - id.longValue();
rv.addBlog(key, tagName, highestId);
for (Iterator entryIter = tagEntries.values().iterator(); entryIter.hasNext(); ) {
EntryContainer entry = (EntryContainer)entryIter.next();
String indexName = Archive.getIndexName(entry.getURI().getEntryId(), entry.getCompleteSize());
rv.addBlogEntry(key, tagName, indexName);
if (!entryIter.hasNext())
lowestEntryId = entry.getURI().getEntryId();
}
}
if (lowestEntryId > newSince)
newBlogs++;
blogsByAge.put(new Long(0-lowestEntryId), key);
}
rv.setAllEntries(allEntries);
rv.setNewBlogs(newBlogs);
rv.setNewEntries(newEntries);
rv.setTotalSize(totalSize);
rv.setNewSize(newSize);
int i = 0;
for (Iterator iter = blogsByAge.keySet().iterator(); iter.hasNext() && i < RECENT_BLOG_COUNT; i++) {
Long when = (Long)iter.next();
Hash key = (Hash)blogsByAge.get(when);
rv.addNewestBlog(key);
}
i = 0;
for (Iterator iter = entriesByAge.keySet().iterator(); iter.hasNext() && i < RECENT_ENTRY_COUNT; i++) {
Long when = (Long)iter.next();
BlogURI uri = (BlogURI)entriesByAge.get(when);
rv.addNewestEntry(uri);
}
return rv;
}
private static class HeaderReceiver implements SMLParser.EventReceiver {
private Properties _headers;
public HeaderReceiver() { _headers = null; }
public String getHeader(String name) { return (_headers != null ? _headers.getProperty(name) : null); }
public void receiveHeader(String header, String value) {
if (_headers == null) _headers = new Properties();
_headers.setProperty(header, value);
}
public void receiveAddress(String name, String schema, String location, String anchorText) {}
public void receiveArchive(String name, String description, String locationSchema, String location, String postingKey, String anchorText) {}
public void receiveAttachment(int id, String anchorText) {}
public void receiveBegin() {}
public void receiveBlog(String name, String blogKeyHash, String blogPath, long blogEntryId, List blogArchiveLocations, String anchorText) {}
public void receiveBold(String text) {}
public void receiveCode(String text, String codeLocationSchema, String codeLocation) {}
public void receiveCut(String summaryText) {}
public void receiveEnd() {}
public void receiveGT() {}
public void receiveH1(String text) {}
public void receiveH2(String text) {}
public void receiveH3(String text) {}
public void receiveH4(String text) {}
public void receiveH5(String text) {}
public void receiveHR() {}
public void receiveHeaderEnd() {}
public void receiveImage(String alternateText, int attachmentId) {}
public void receiveItalic(String text) {}
public void receiveLT() {}
public void receiveLeftBracket() {}
public void receiveLink(String schema, String location, String text) {}
public void receiveNewline() {}
public void receivePlain(String text) {}
public void receivePre(String text) {}
public void receiveQuote(String text, String whoQuoted, String quoteLocationSchema, String quoteLocation) {}
public void receiveRightBracket() {}
public void receiveUnderline(String text) {}
}
}


@@ -0,0 +1,485 @@
package net.i2p.syndie;
import java.io.*;
import java.text.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
import net.i2p.syndie.sml.*;
/**
*
*/
public class BlogManager {
private I2PAppContext _context;
private static BlogManager _instance;
private File _blogKeyDir;
private File _privKeyDir;
private File _archiveDir;
private File _userDir;
private File _cacheDir;
private File _tempDir;
private File _rootDir;
private Archive _archive;
static {
TimeZone.setDefault(TimeZone.getTimeZone("GMT"));
String rootDir = I2PAppContext.getGlobalContext().getProperty("syndie.rootDir");
if (false) {
if (rootDir == null)
rootDir = System.getProperty("user.home");
rootDir = rootDir + File.separatorChar + ".syndie";
} else {
if (rootDir == null)
rootDir = "./syndie";
}
_instance = new BlogManager(I2PAppContext.getGlobalContext(), rootDir);
}
public static BlogManager instance() { return _instance; }
public BlogManager(I2PAppContext ctx, String rootDir) {
_context = ctx;
_rootDir = new File(rootDir);
_rootDir.mkdirs();
readConfig();
_blogKeyDir = new File(_rootDir, "blogkeys");
_privKeyDir = new File(_rootDir, "privkeys");
String archiveDir = _context.getProperty("syndie.archiveDir");
if (archiveDir != null)
_archiveDir = new File(archiveDir);
else
_archiveDir = new File(_rootDir, "archive");
_userDir = new File(_rootDir, "users");
_cacheDir = new File(_rootDir, "cache");
_tempDir = new File(_rootDir, "temp");
_blogKeyDir.mkdirs();
_privKeyDir.mkdirs();
_archiveDir.mkdirs();
_cacheDir.mkdirs();
_userDir.mkdirs();
_tempDir.mkdirs();
_archive = new Archive(ctx, _archiveDir.getAbsolutePath(), _cacheDir.getAbsolutePath());
_archive.regenerateIndex();
}
private void readConfig() {
File config = new File(_rootDir, "syndie.config");
if (config.exists()) {
try {
Properties p = new Properties();
DataHelper.loadProps(p, config);
for (Iterator iter = p.keySet().iterator(); iter.hasNext(); ) {
String key = (String)iter.next();
System.setProperty(key, p.getProperty(key));
}
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
}
public void writeConfig() {
File config = new File(_rootDir, "syndie.config");
FileOutputStream out = null;
try {
out = new FileOutputStream(config);
for (Iterator iter = _context.getPropertyNames().iterator(); iter.hasNext(); ) {
String name = (String)iter.next();
if (name.startsWith("syndie."))
out.write(DataHelper.getUTF8(name + '=' + _context.getProperty(name) + '\n'));
}
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (out != null) try { out.close(); } catch (IOException ioe) {}
}
}
public BlogInfo createBlog(String name, String description, String contactURL, String archives[]) {
return createBlog(name, null, description, contactURL, archives);
}
public BlogInfo createBlog(String name, SigningPublicKey posters[], String description, String contactURL, String archives[]) {
Object keys[] = _context.keyGenerator().generateSigningKeypair();
SigningPublicKey pub = (SigningPublicKey)keys[0];
SigningPrivateKey priv = (SigningPrivateKey)keys[1];
try {
FileOutputStream out = new FileOutputStream(new File(_privKeyDir, Base64.encode(pub.calculateHash().getData()) + ".priv"));
pub.writeBytes(out);
priv.writeBytes(out);
} catch (DataFormatException dfe) {
dfe.printStackTrace();
return null;
} catch (IOException ioe) {
ioe.printStackTrace();
return null;
}
return createInfo(pub, priv, name, posters, description, contactURL, archives, 0);
}
public BlogInfo createInfo(SigningPublicKey pub, SigningPrivateKey priv, String name, SigningPublicKey posters[],
String description, String contactURL, String archives[], int edition) {
Properties opts = new Properties();
opts.setProperty("Name", name);
opts.setProperty("Description", description);
opts.setProperty("Edition", Integer.toString(edition));
opts.setProperty("ContactURL", contactURL);
for (int i = 0; archives != null && i < archives.length; i++)
opts.setProperty("Archive." + i, archives[i]);
BlogInfo info = new BlogInfo(pub, posters, opts);
info.sign(_context, priv);
_archive.storeBlogInfo(info);
return info;
}
public Archive getArchive() { return _archive; }
public File getTempDir() { return _tempDir; }
public List listMyBlogs() {
File files[] = _privKeyDir.listFiles();
List rv = new ArrayList();
for (int i = 0; i < files.length; i++) {
if (files[i].isFile() && !files[i].isHidden()) {
try {
SigningPublicKey pub = new SigningPublicKey();
pub.readBytes(new FileInputStream(files[i]));
BlogInfo info = _archive.getBlogInfo(pub.calculateHash());
if (info != null)
rv.add(info);
} catch (IOException ioe) {
ioe.printStackTrace();
} catch (DataFormatException dfe) {
dfe.printStackTrace();
}
}
}
return rv;
}
public SigningPrivateKey getMyPrivateKey(BlogInfo blog) {
if (blog == null) return null;
File keyFile = new File(_privKeyDir, Base64.encode(blog.getKey().calculateHash().getData()) + ".priv");
try {
FileInputStream in = new FileInputStream(keyFile);
SigningPublicKey pub = new SigningPublicKey();
pub.readBytes(in);
SigningPrivateKey priv = new SigningPrivateKey();
priv.readBytes(in);
return priv;
} catch (IOException ioe) {
ioe.printStackTrace();
return null;
} catch (DataFormatException dfe) {
dfe.printStackTrace();
return null;
}
}
public String login(User user, String login, String pass) {
Hash userHash = _context.sha().calculateHash(DataHelper.getUTF8(login));
Hash passHash = _context.sha().calculateHash(DataHelper.getUTF8(pass));
File userFile = new File(_userDir, Base64.encode(userHash.getData()));
System.out.println("Attempting to login to " + login + " w/ pass = " + pass
+ ": file = " + userFile.getAbsolutePath() + " passHash = "
+ Base64.encode(passHash.getData()));
if (userFile.exists()) {
try {
Properties props = new Properties();
FileInputStream fin = new FileInputStream(userFile);
BufferedReader in = new BufferedReader(new InputStreamReader(fin, "UTF-8"));
String line = null;
while ( (line = in.readLine()) != null) {
int split = line.indexOf('=');
if (split <= 0) continue;
String key = line.substring(0, split);
String val = line.substring(split+1);
props.setProperty(key.trim(), val.trim());
}
return user.login(login, pass, props);
} catch (IOException ioe) {
ioe.printStackTrace();
return "Error logging in - corrupt userfile";
}
} else {
return "User does not exist";
}
}
/** hash of the password required to register and create a new blog (null means no password required) */
public String getRegistrationPassword() {
String pass = _context.getProperty("syndie.registrationPassword");
if ( (pass == null) || (pass.trim().length() <= 0) ) return null;
return pass;
}
public void saveUser(User user) {
if (!user.getAuthenticated()) return;
String userHash = Base64.encode(_context.sha().calculateHash(DataHelper.getUTF8(user.getUsername())).getData());
File userFile = new File(_userDir, userHash);
FileOutputStream out = null;
try {
out = new FileOutputStream(userFile);
out.write(DataHelper.getUTF8(user.export()));
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (out != null) try { out.close(); } catch (IOException ioe){}
}
}
public String register(User user, String login, String password, String registrationPassword, String blogName, String blogDescription, String contactURL) {
System.err.println("Register [" + login + "] pass [" + password + "] name [" + blogName + "] descr [" + blogDescription + "] contact [" + contactURL + "]");
System.err.println("reference bad string: [" + EncodingTestGenerator.TEST_STRING + "]");
String hashedRegistrationPassword = getRegistrationPassword();
if (hashedRegistrationPassword != null) {
try {
if (!hashedRegistrationPassword.equals(Base64.encode(_context.sha().calculateHash(registrationPassword.getBytes("UTF-8")).getData())))
return "Invalid registration password";
} catch (UnsupportedEncodingException uee) {
return "Error registering";
}
}
String userHash = Base64.encode(_context.sha().calculateHash(DataHelper.getUTF8(login)).getData());
File userFile = new File(_userDir, userHash);
if (userFile.exists()) {
return "Cannot register the login " + login + ": it already exists";
} else {
BlogInfo info = createBlog(blogName, blogDescription, contactURL, null);
String hashedPassword = Base64.encode(_context.sha().calculateHash(DataHelper.getUTF8(password)).getData());
FileOutputStream out = null;
try {
out = new FileOutputStream(userFile);
BufferedWriter bw = new BufferedWriter(new OutputStreamWriter(out, "UTF-8"));
bw.write("password=" + hashedPassword + "\n");
bw.write("blog=" + Base64.encode(info.getKey().calculateHash().getData()) + "\n");
bw.write("lastid=-1\n");
bw.write("lastmetaedition=0\n");
bw.write("addressbook=userhosts-"+userHash + ".txt\n");
bw.write("showimages=false\n");
bw.write("showexpanded=false\n");
bw.flush();
} catch (IOException ioe) {
ioe.printStackTrace();
return "Internal error registering - " + ioe.getMessage();
} finally {
if (out != null) try { out.close(); } catch (IOException ioe) {}
}
String loginResult = login(user, login, password);
_archive.regenerateIndex();
return loginResult;
}
}
public BlogURI createBlogEntry(User user, String subject, String tags, String entryHeaders, String sml) {
return createBlogEntry(user, subject, tags, entryHeaders, sml, null, null, null);
}
public BlogURI createBlogEntry(User user, String subject, String tags, String entryHeaders, String sml, List fileNames, List fileStreams, List fileTypes) {
if (!user.getAuthenticated()) return null;
BlogInfo info = getArchive().getBlogInfo(user.getBlog());
if (info == null) return null;
SigningPrivateKey privkey = getMyPrivateKey(info);
if (privkey == null) return null;
long entryId = -1;
long now = _context.clock().now();
long dayBegin = getDayBegin(now);
if (user.getMostRecentEntry() >= dayBegin)
entryId = user.getMostRecentEntry() + 1;
else
entryId = dayBegin;
StringTokenizer tok = new StringTokenizer(tags, " ,\n\t");
String tagList[] = new String[tok.countTokens()];
for (int i = 0; i < tagList.length; i++)
tagList[i] = tok.nextToken().trim();
BlogURI uri = new BlogURI(user.getBlog(), entryId);
try {
StringBuffer raw = new StringBuffer(sml.length() + 128);
raw.append("Subject: ").append(subject).append('\n');
raw.append("Tags: ");
for (int i = 0; i < tagList.length; i++)
raw.append(tagList[i]).append('\t');
raw.append('\n');
if ( (entryHeaders != null) && (entryHeaders.trim().length() > 0) ) {
System.out.println("Entry headers: " + entryHeaders);
BufferedReader userHeaders = new BufferedReader(new InputStreamReader(new ByteArrayInputStream(DataHelper.getUTF8(entryHeaders)), "UTF-8"));
String line = null;
while ( (line = userHeaders.readLine()) != null) {
line = line.trim();
System.out.println("Line: " + line);
if (line.length() <= 0) continue;
int split = line.indexOf('=');
int split2 = line.indexOf(':');
if ( (split < 0) || ( (split2 > 0) && (split2 < split) ) ) split = split2;
if (split <= 0) continue; // skip malformed header lines with no '=' or ':' delimiter
String key = line.substring(0,split).trim();
String val = line.substring(split+1).trim();
raw.append(key).append(": ").append(val).append('\n');
}
}
raw.append('\n');
raw.append(sml);
EntryContainer c = new EntryContainer(uri, tagList, DataHelper.getUTF8(raw));
if ((fileNames != null) && (fileStreams != null) && (fileNames.size() == fileStreams.size()) ) {
for (int i = 0; i < fileNames.size(); i++) {
String name = (String)fileNames.get(i);
InputStream in = (InputStream)fileStreams.get(i);
String fileType = (fileTypes != null ? (String)fileTypes.get(i) : "application/octet-stream");
ByteArrayOutputStream baos = new ByteArrayOutputStream(1024);
byte buf[] = new byte[1024];
while (true) {
int read = in.read(buf);
if (read == -1) break;
baos.write(buf, 0, read);
}
byte att[] = baos.toByteArray();
if ( (att != null) && (att.length > 0) )
c.addAttachment(att, new File(name).getName(), null, fileType);
}
}
//for (int i = 7; i < args.length; i++) {
// c.addAttachment(read(args[i]), new File(args[i]).getName(),
// "Attached file", "application/octet-stream");
//}
SessionKey entryKey = null;
//if (!"NONE".equals(args[5]))
// entryKey = new SessionKey(Base64.decode(args[5]));
c.seal(_context, privkey, null);
boolean ok = getArchive().storeEntry(c);
if (ok) {
getArchive().regenerateIndex();
user.setMostRecentEntry(entryId);
saveUser(user);
return uri;
} else {
return null;
}
} catch (IOException ioe) {
ioe.printStackTrace();
return null;
}
}
/**
* read in the syndie blog metadata file from the stream, verifying it and adding it to
* the archive if necessary
*
*/
public boolean importBlogMetadata(InputStream metadataStream) throws IOException {
try {
BlogInfo info = new BlogInfo();
info.load(metadataStream);
return _archive.storeBlogInfo(info);
} catch (IOException ioe) {
ioe.printStackTrace();
return false;
}
}
/**
* read in the syndie entry file from the stream, verifying it and adding it to
* the archive if necessary
*
*/
public boolean importBlogEntry(InputStream entryStream) throws IOException {
try {
EntryContainer c = new EntryContainer();
c.load(entryStream);
return _archive.storeEntry(c);
} catch (IOException ioe) {
ioe.printStackTrace();
return false;
}
}
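/*
 * Usage sketch (not in the original source): both importers consume a raw stream of
 * the serialized syndie data, so a caller feeding files from disk might do:
 *
 *   BlogManager mgr = ...;
 *   boolean metaOk  = mgr.importBlogMetadata(new FileInputStream(metadataFile));
 *   boolean entryOk = mgr.importBlogEntry(new FileInputStream(entryFile));
 *
 * The variable and file names here are hypothetical; only the stream contents matter,
 * and both methods return false if the data cannot be parsed or stored.
 */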
public String addAddress(User user, String name, String location, String schema) {
if (!user.getAuthenticated()) return "Not logged in";
boolean ok = validateAddressName(name);
if (!ok) return "Invalid name: " + HTMLRenderer.sanitizeString(name);
ok = validateAddressLocation(location);
if (!ok) return "Invalid location: " + HTMLRenderer.sanitizeString(location);
if (!validateAddressSchema(schema)) return "Unsupported schema: " + HTMLRenderer.sanitizeString(schema);
// no need to quote user/location further, as they've been sanitized
FileOutputStream out = null;
try {
File userHostsFile = new File(user.getAddressbookLocation());
Properties knownHosts = getKnownHosts(user, true);
if (knownHosts.containsKey(name)) return "Name is already in use";
out = new FileOutputStream(userHostsFile, true);
out.write(DataHelper.getUTF8(name + "=" + location + '\n'));
return "Address " + name + " written to your hosts file (" + userHostsFile.getName() + ")";
} catch (IOException ioe) {
return "Error writing out host entry: " + ioe.getMessage();
} finally {
if (out != null) try { out.close(); } catch (IOException ioe) {}
}
}
public Properties getKnownHosts(User user, boolean includePublic) throws IOException {
Properties rv = new Properties();
if ( (user != null) && (user.getAuthenticated()) ) {
File userHostsFile = new File(user.getAddressbookLocation());
rv.putAll(getKnownHosts(userHostsFile));
}
if (includePublic) {
rv.putAll(getKnownHosts(new File("hosts.txt")));
}
return rv;
}
private Properties getKnownHosts(File filename) throws IOException {
Properties rv = new Properties();
if (filename.exists()) {
rv.load(new FileInputStream(filename));
}
return rv;
}
private boolean validateAddressName(String name) {
if ( (name == null) || (name.trim().length() <= 0) || (!name.endsWith(".i2p")) ) return false;
for (int i = 0; i < name.length(); i++) {
char c = name.charAt(i);
if (!Character.isLetterOrDigit(c) && ('.' != c) && ('-' != c) && ('_' != c) )
return false;
}
return true;
}
private boolean validateAddressLocation(String location) {
if ( (location == null) || (location.trim().length() <= 0) ) return false;
try {
Destination d = new Destination(location);
return (d.getPublicKey() != null);
} catch (DataFormatException dfe) {
dfe.printStackTrace();
return false;
}
}
private boolean validateAddressSchema(String schema) {
if ( (schema == null) || (schema.trim().length() <= 0) ) return false;
return "eep".equals(schema) || "i2p".equals(schema);
}
private final SimpleDateFormat _dateFormat = new SimpleDateFormat("yyyy/MM/dd", Locale.UK);
private final long getDayBegin(long now) {
synchronized (_dateFormat) {
try {
String str = _dateFormat.format(new Date(now));
return _dateFormat.parse(str).getTime();
} catch (ParseException pe) {
pe.printStackTrace();
// wtf
return -1;
}
}
}
}


@@ -0,0 +1,188 @@
package net.i2p.syndie;
import java.io.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
import net.i2p.syndie.sml.*;
/**
*/
public class CLI {
public static final String USAGE = "Usage: \n" +
"rootDir regenerateIndex\n" +
"rootDir createBlog name description contactURL[ archiveURL]*\n" +
"rootDir createEntry blogPublicKeyHash tag[,tag]* (NOW|entryId) (NONE|entryKeyBase64) smlFile[ attachmentFile]*\n" +
"rootDir listMyBlogs\n" +
"rootDir listTags blogPublicKeyHash\n" +
"rootDir listEntries blogPublicKeyHash blogTag\n" +
"rootDir renderEntry blogPublicKeyHash entryId (NONE|entryKeyBase64) summaryOnly includeImages\n";
public static void main(String args[]) {
//args = new String[] { "~/.syndie/", "listEntries", "9qXCJUyUBCCaiIShURo02ckxjrMvrtiDYENv2ATL3-Y=", "/" };
//args = new String[] { "~/.syndie/", "renderEntry", "Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=", "/", "20050811001", "NONE", "true", "false" };
if (args.length < 2) {
System.err.print(USAGE);
return;
}
String command = args[1];
if ("createBlog".equals(command))
createBlog(args);
else if ("listMyBlogs".equals(command))
listMyBlogs(args);
else if ("createEntry".equals(command))
createEntry(args);
else if ("listTags".equals(command))
listPaths(args);
else if ("listEntries".equals(command))
listEntries(args);
else if ("regenerateIndex".equals(command))
regenerateIndex(args);
else if ("renderEntry".equals(command))
renderEntry(args);
else
System.out.print(USAGE);
}
private static void createBlog(String args[]) {
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
String archives[] = new String[args.length - 5];
System.arraycopy(args, 5, archives, 0, archives.length);
BlogInfo info = mgr.createBlog(args[2], args[3], args[4], archives);
System.out.println("Blog created: " + info);
mgr.getArchive().regenerateIndex();
}
private static void listMyBlogs(String args[]) {
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
List info = mgr.listMyBlogs();
for (int i = 0; i < info.size(); i++)
System.out.println(info.get(i).toString());
}
private static void listPaths(String args[]) {
// "rootDir listTags blogPublicKeyHash\n";
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
List tags = mgr.getArchive().listTags(new Hash(Base64.decode(args[2])));
System.out.println("tag count: " + tags.size());
for (int i = 0; i < tags.size(); i++)
System.out.println("Tag " + i + ": " + tags.get(i).toString());
}
private static void regenerateIndex(String args[]) {
// "rootDir regenerateIndex\n";
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
mgr.getArchive().regenerateIndex();
System.out.println("Index regenerated");
}
private static void listEntries(String args[]) {
// "rootDir listEntries blogPublicKeyHash tag\n";
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
List entries = mgr.getArchive().listEntries(new Hash(Base64.decode(args[2])), -1, args[3], null);
System.out.println("Entry count: " + entries.size());
for (int i = 0; i < entries.size(); i++) {
EntryContainer entry = (EntryContainer)entries.get(i);
System.out.println("***************************************************");
System.out.println("Entry " + i + ": " + entry.getURI().toString());
System.out.println("===================================================");
System.out.println(entry.getEntry().getText());
System.out.println("===================================================");
Attachment attachments[] = entry.getAttachments();
for (int j = 0; j < attachments.length; j++) {
System.out.println("Attachment " + j + ": " + attachments[j]);
}
System.out.println("===================================================");
}
}
private static void renderEntry(String args[]) {
//"rootDir renderEntry blogPublicKeyHash entryId (NONE|entryKeyBase64) summaryOnly includeImages\n";
BlogManager mgr = new BlogManager(I2PAppContext.getGlobalContext(), args[0]);
long id = -1;
try {
id = Long.parseLong(args[3]);
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
return;
}
SessionKey entryKey = null;
if (!("NONE".equals(args[4])))
entryKey = new SessionKey(Base64.decode(args[4]));
EntryContainer entry = mgr.getArchive().getEntry(new BlogURI(new Hash(Base64.decode(args[2])), id), entryKey);
if (entry != null) {
HTMLRenderer renderer = new HTMLRenderer();
boolean summaryOnly = "true".equalsIgnoreCase(args[5]);
boolean showImages = "true".equalsIgnoreCase(args[6]);
try {
File f = File.createTempFile("syndie", ".html");
Writer out = new OutputStreamWriter(new FileOutputStream(f), "UTF-8");
renderer.render(null, mgr.getArchive(), entry, out, summaryOnly, showImages);
out.flush();
out.close();
System.out.println("Rendered to " + f.getAbsolutePath() + ": " + f.length());
} catch (IOException ioe) {
ioe.printStackTrace();
}
} else {
System.err.println("Entry does not exist");
}
}
private static void createEntry(String args[]) {
// "rootDir createEntry blogPublicKey tag[,tag]* (NOW|entryId) (NONE|entryKeyBase64) smlFile[ attachmentFile]*\n" +
I2PAppContext ctx = I2PAppContext.getGlobalContext();
BlogManager mgr = new BlogManager(ctx, args[0]);
long entryId = -1;
if ("NOW".equals(args[4])) {
entryId = ctx.clock().now();
} else {
try {
entryId = Long.parseLong(args[4]);
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
return;
}
}
StringTokenizer tok = new StringTokenizer(args[3], ",");
String tags[] = new String[tok.countTokens()];
for (int i = 0; i < tags.length; i++)
tags[i] = tok.nextToken();
BlogURI uri = new BlogURI(new Hash(Base64.decode(args[2])), entryId);
BlogInfo blog = mgr.getArchive().getBlogInfo(uri);
if (blog == null) {
System.err.println("Blog does not exist: " + uri);
return;
}
SigningPrivateKey key = mgr.getMyPrivateKey(blog);
try {
byte smlData[] = read(args[6]);
EntryContainer c = new EntryContainer(uri, tags, smlData);
for (int i = 7; i < args.length; i++) {
c.addAttachment(read(args[i]), new File(args[i]).getName(),
"Attached file", "application/octet-stream");
}
SessionKey entryKey = null;
if (!"NONE".equals(args[5]))
entryKey = new SessionKey(Base64.decode(args[5]));
c.seal(ctx, key, entryKey);
boolean ok = mgr.getArchive().storeEntry(c);
System.out.println("Blog entry created: " + c+ "? " + ok);
if (ok)
mgr.getArchive().regenerateIndex();
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
private static final byte[] read(String file) throws IOException {
File f = new File(file);
FileInputStream in = new FileInputStream(f);
byte rv[] = new byte[(int)f.length()];
if (rv.length != DataHelper.read(in, rv))
throw new IOException("File not read completely");
return rv;
}
}


@@ -0,0 +1,237 @@
package net.i2p.syndie;
import java.io.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
/**
* Lazy loading wrapper for an entry, pulling data out of a cached & extracted dir,
* rather than dealing with the crypto, zip, etc.
*
*/
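/*
 * For reference (added note, not in the original source), the lazy accessors map onto
 * the files written by EntryExtractor:
 *   meta.txt                         -> getFormat(), getURI(), getCompleteSize() (read in the constructor)
 *   headers.txt                      -> getHeader(...)        (read on first use)
 *   entry.sml                        -> getEntry().getText()  (read on first use)
 *   attachmentN_meta.txt / _data.dat -> getAttachments()      (metadata on first use, data streamed on demand)
 */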
class CachedEntry extends EntryContainer {
private File _entryDir;
private int _format;
private int _size;
private BlogURI _blog;
private Properties _headers;
private Entry _entry;
private Attachment _attachments[];
public CachedEntry(File entryDir) throws IOException {
_entryDir = entryDir;
importMeta();
_entry = new CachedEntryDetails();
_attachments = null;
}
// always available, loaded from meta
public int getFormat() { return _format; }
public BlogURI getURI() { return _blog; }
public int getCompleteSize() { return _size; }
// dont need to override it, as it works off getHeader
//public String[] getTags() { return super.getTags(); }
public Entry getEntry() { return _entry; }
public Attachment[] getAttachments() {
importAttachments();
return _attachments;
}
public String getHeader(String key) {
importHeaders();
return _headers.getProperty(key);
}
public String toString() { return getURI().toString(); }
public boolean verifySignature(I2PAppContext ctx, BlogInfo info) { return true; }
// not supported...
public void parseRawData(I2PAppContext ctx) throws IOException {
throw new IllegalStateException("Not supported on cached entries");
}
public void parseRawData(I2PAppContext ctx, SessionKey zipKey) throws IOException {
throw new IllegalStateException("Not supported on cached entries");
}
public void setHeader(String name, String val) {
throw new IllegalStateException("Not supported on cached entries");
}
public void addAttachment(byte data[], String name, String description, String mimeType) {
throw new IllegalStateException("Not supported on cached entries");
}
public void write(OutputStream out, boolean includeRealSignature) throws IOException {
throw new IllegalStateException("Not supported on cached entries");
}
public Signature getSignature() {
throw new IllegalStateException("Not supported on cached entries");
}
// now the actual lazy loading code
private void importMeta() {
Properties meta = readProps(new File(_entryDir, EntryExtractor.META));
_format = getInt(meta, "format");
_size = getInt(meta, "size");
_blog = new BlogURI(new Hash(Base64.decode(meta.getProperty("blog"))), getLong(meta, "entry"));
}
private Properties importHeaders() {
if (_headers == null)
_headers = readProps(new File(_entryDir, EntryExtractor.HEADERS));
return _headers;
}
private void importAttachments() {
if (_attachments == null) {
List attachments = new ArrayList();
int i = 0;
while (true) {
File meta = new File(_entryDir, EntryExtractor.ATTACHMENT_PREFIX + i + EntryExtractor.ATTACHMENT_META_SUFFIX);
if (meta.exists())
attachments.add(new CachedAttachment(i, meta));
else
break;
i++;
}
Attachment a[] = new Attachment[attachments.size()];
for (i = 0; i < a.length; i++)
a[i] = (Attachment)attachments.get(i);
_attachments = a;
}
return;
}
private static Properties readProps(File propsFile) {
Properties rv = new Properties();
BufferedReader in = null;
try {
in = new BufferedReader(new InputStreamReader(new FileInputStream(propsFile), "UTF-8"));
String line = null;
while ( (line = in.readLine()) != null) {
int split = line.indexOf('=');
if ( (split <= 0) || (split >= line.length()) ) continue;
rv.setProperty(line.substring(0, split).trim(), line.substring(split+1).trim());
}
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (in != null) try { in.close(); } catch (IOException ioe) {}
}
return rv;
}
private static final int getInt(Properties props, String key) {
String val = props.getProperty(key);
try { return Integer.parseInt(val); } catch (NumberFormatException nfe) {}
return -1;
}
private static final long getLong(Properties props, String key) {
String val = props.getProperty(key);
try { return Long.parseLong(val); } catch (NumberFormatException nfe) {}
return -1l;
}
private class CachedEntryDetails extends Entry {
private String _text;
public CachedEntryDetails() {
super(null);
}
public String getText() {
importText();
return _text;
}
private void importText() {
if (_text == null) {
InputStream in = null;
try {
File f = new File(_entryDir, EntryExtractor.ENTRY);
byte buf[] = new byte[(int)f.length()]; // hmm
in = new FileInputStream(f);
int read = DataHelper.read(in, buf);
if (read != buf.length) throw new IOException("read: " + read + " file size: " + buf.length + " for " + f.getPath());
_text = DataHelper.getUTF8(buf);
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (in != null) try { in.close(); } catch (IOException ioe) {}
}
}
}
}
private class CachedAttachment extends Attachment {
private int _attachmentId;
private File _metaFile;
private Properties _attachmentHeaders;
private int _dataSize;
public CachedAttachment(int id, File meta) {
super(null, null);
_attachmentId = id;
_metaFile = meta;
_attachmentHeaders = null;
}
public int getDataLength() {
importAttachmentHeaders();
return _dataSize;
}
public byte[] getData() {
throw new IllegalStateException("Not supported on cached entries");
}
public InputStream getDataStream() throws IOException {
String name = EntryExtractor.ATTACHMENT_PREFIX + _attachmentId + EntryExtractor.ATTACHMENT_DATA_SUFFIX;
File f = new File(_entryDir, name);
return new FileInputStream(f);
}
public byte[] getRawMetadata() {
throw new IllegalStateException("Not supported on cached entries");
}
public String getMeta(String key) {
importAttachmentHeaders();
return _attachmentHeaders.getProperty(key);
}
//public String getName() { return getMeta(NAME); }
//public String getDescription() { return getMeta(DESCRIPTION); }
//public String getMimeType() { return getMeta(MIMETYPE); }
public void setMeta(String key, String val) {
throw new IllegalStateException("Not supported on cached entries");
}
public Map getMeta() {
importAttachmentHeaders();
return _attachmentHeaders;
}
public String toString() {
importAttachmentHeaders();
int len = _dataSize;
return getName()
+ (getDescription() != null ? ": " + getDescription() : "")
+ (getMimeType() != null ? ", type: " + getMimeType() : "")
+ ", size: " + len;
}
private void importAttachmentHeaders() {
if (_attachmentHeaders == null) {
Properties props = readProps(_metaFile);
String sz = (String)props.remove(EntryExtractor.ATTACHMENT_DATA_SIZE);
if (sz != null) {
try {
_dataSize = Integer.parseInt(sz);
} catch (NumberFormatException nfe) {}
}
_attachmentHeaders = props;
}
}
}
}


@@ -0,0 +1,132 @@
package net.i2p.syndie;
import java.io.*;
import java.util.*;
import java.util.zip.*;
import net.i2p.data.*;
import net.i2p.syndie.data.*;
import net.i2p.I2PAppContext;
/**
* To cut down on unnecessary IO/cpu load, extract entries onto the disk for
* faster access later. Individual entries are stored in subdirectories based on
* their name - $archiveDir/$blogDir/$entryId.snd extracts its files into various
* files under $cacheDir/$blogDir/$entryId/:
* headers.txt: name=value pairs for attributes of the entry container itself
meta.txt: name=value pairs for implicit attributes of the container (blog, id, format, size)
* entry.sml: raw sml file
* attachmentN_data.dat: raw binary data for attachment N
attachmentN_meta.txt: name=value pairs for attributes of attachment N
*
*/
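/*
 * Hypothetical example of the resulting cache layout for one extracted entry
 * (directory and entry names invented for illustration):
 *
 *   $cacheDir/<base64 blog key hash>/20050811001/
 *       meta.txt               format=..., size=..., blog=..., entry=...
 *       headers.txt            name=value pairs from the entry container
 *       entry.sml
 *       attachment0_data.dat
 *       attachment0_meta.txt
 *
 * CachedEntry reads these files back lazily instead of re-verifying and re-parsing
 * the original .snd container.
 */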
public class EntryExtractor {
private I2PAppContext _context;
static final String HEADERS = "headers.txt";
static final String META = "meta.txt";
static final String ENTRY = "entry.sml";
static final String ATTACHMENT_PREFIX = "attachment";
static final String ATTACHMENT_DATA_SUFFIX = "_data.dat";
static final String ATTACHMENT_META_SUFFIX = "_meta.txt";
static final String ATTACHMENT_DATA_SIZE = "EntryExtractor__dataSize";
public EntryExtractor(I2PAppContext context) {
_context = context;
}
public boolean extract(File entryFile, File entryDir, SessionKey entryKey, BlogInfo info) throws IOException {
EntryContainer entry = new EntryContainer();
entry.load(new FileInputStream(entryFile));
boolean ok = entry.verifySignature(_context, info);
if (!ok) {
return false;
} else {
entry.setCompleteSize((int)entryFile.length());
if (entryKey != null)
entry.parseRawData(_context, entryKey);
else
entry.parseRawData(_context);
extract(entry, entryDir);
return true;
}
}
public void extract(EntryContainer entry, File entryDir) throws IOException {
extractHeaders(entry, entryDir);
extractMeta(entry, entryDir);
extractEntry(entry, entryDir);
Attachment attachments[] = entry.getAttachments();
if (attachments != null) {
for (int i = 0; i < attachments.length; i++) {
extractAttachmentData(i, attachments[i], entryDir);
extractAttachmentMetadata(i, attachments[i], entryDir);
}
}
}
private void extractHeaders(EntryContainer entry, File entryDir) throws IOException {
FileOutputStream out = null;
try {
out = new FileOutputStream(new File(entryDir, HEADERS));
Map headers = entry.getHeaders();
for (Iterator iter = headers.keySet().iterator(); iter.hasNext(); ) {
String k = (String)iter.next();
String v = (String)headers.get(k);
out.write(DataHelper.getUTF8(k.trim() + '=' + v.trim() + '\n'));
}
} finally {
out.close();
}
}
private void extractMeta(EntryContainer entry, File entryDir) throws IOException {
FileOutputStream out = null;
try {
out = new FileOutputStream(new File(entryDir, META));
out.write(DataHelper.getUTF8("format=" + entry.getFormat() + '\n'));
out.write(DataHelper.getUTF8("size=" + entry.getCompleteSize() + '\n'));
out.write(DataHelper.getUTF8("blog=" + entry.getURI().getKeyHash().toBase64() + '\n'));
out.write(DataHelper.getUTF8("entry=" + entry.getURI().getEntryId() + '\n'));
} finally {
out.close();
}
}
private void extractEntry(EntryContainer entry, File entryDir) throws IOException {
FileOutputStream out = null;
try {
out = new FileOutputStream(new File(entryDir, ENTRY));
out.write(DataHelper.getUTF8(entry.getEntry().getText()));
} finally {
out.close();
}
}
private void extractAttachmentData(int num, Attachment attachment, File entryDir) throws IOException {
FileOutputStream out = null;
try {
out = new FileOutputStream(new File(entryDir, ATTACHMENT_PREFIX + num + ATTACHMENT_DATA_SUFFIX));
//out.write(attachment.getData());
InputStream data = attachment.getDataStream();
byte buf[] = new byte[1024];
int read = 0;
while ( (read = data.read(buf)) != -1)
out.write(buf, 0, read);
data.close();
} finally {
out.close();
}
}
private void extractAttachmentMetadata(int num, Attachment attachment, File entryDir) throws IOException {
FileOutputStream out = null;
try {
out = new FileOutputStream(new File(entryDir, ATTACHMENT_PREFIX + num + ATTACHMENT_META_SUFFIX));
Map meta = attachment.getMeta();
for (Iterator iter = meta.keySet().iterator(); iter.hasNext(); ) {
String k = (String)iter.next();
String v = (String)meta.get(k);
out.write(DataHelper.getUTF8(k + '=' + v + '\n'));
}
out.write(DataHelper.getUTF8(ATTACHMENT_DATA_SIZE + '=' + attachment.getDataLength()));
} finally {
out.close();
}
}
}


@@ -0,0 +1,231 @@
package net.i2p.syndie;
import java.io.UnsupportedEncodingException;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
/**
* User session state and preferences.
*
*/
public class User {
private I2PAppContext _context;
private String _username;
private String _hashedPassword;
private Hash _blog;
private long _mostRecentEntry;
/** Group name to List of blog selectors, where the selectors are of the form
* blog://$key, entry://$key/$entryId, blogtag://$key/$tag, tag://$tag
*/
private Map _blogGroups;
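/* For illustration (hypothetical values), one persisted "groups" line combining the
 * selector forms above, as parsed by login() and written by export():
 *   groups=friends:blog://<base64 key hash>,tag://i2p tech:blogtag://<base64 key hash>/routers
 */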
/** list of blogs (Hash) we never want to see entries from */
private List _shitlistedBlogs;
/** where our userhosts.txt is */
private String _addressbookLocation;
private boolean _showImagesByDefault;
private boolean _showExpandedByDefault;
private String _defaultSelector;
private long _lastLogin;
private long _lastMetaEntry;
private boolean _allowAccessRemote;
private boolean _authenticated;
private String _eepProxyHost;
private int _eepProxyPort;
private String _webProxyHost;
private int _webProxyPort;
private String _torProxyHost;
private int _torProxyPort;
public User() {
_context = I2PAppContext.getGlobalContext();
init();
}
private void init() {
_authenticated = false;
_username = null;
_hashedPassword = null;
_blog = null;
_mostRecentEntry = -1;
_blogGroups = new HashMap();
_shitlistedBlogs = new ArrayList();
_defaultSelector = null;
_addressbookLocation = "userhosts.txt";
_showImagesByDefault = false;
_showExpandedByDefault = false;
_allowAccessRemote = false;
_eepProxyHost = null;
_webProxyHost = null;
_torProxyHost = null;
_eepProxyPort = -1;
_webProxyPort = -1;
_torProxyPort = -1;
_lastLogin = -1;
_lastMetaEntry = 0;
}
public boolean getAuthenticated() { return _authenticated; }
public String getUsername() { return _username; }
public Hash getBlog() { return _blog; }
public String getBlogStr() { return Base64.encode(_blog.getData()); }
public long getMostRecentEntry() { return _mostRecentEntry; }
public Map getBlogGroups() { return _blogGroups; }
public List getShitlistedBlogs() { return _shitlistedBlogs; }
public String getAddressbookLocation() { return _addressbookLocation; }
public boolean getShowImages() { return _showImagesByDefault; }
public boolean getShowExpanded() { return _showExpandedByDefault; }
public long getLastLogin() { return _lastLogin; }
public String getHashedPassword() { return _hashedPassword; }
public long getLastMetaEntry() { return _lastMetaEntry; }
public String getDefaultSelector() { return _defaultSelector; }
public void setDefaultSelector(String sel) { _defaultSelector = sel; }
public boolean getAllowAccessRemote() { return _allowAccessRemote; }
public void setAllowAccessRemote(boolean allow) { _allowAccessRemote = allow; }
public void setMostRecentEntry(long id) { _mostRecentEntry = id; }
public void setLastMetaEntry(long id) { _lastMetaEntry = id; }
public String getEepProxyHost() { return _eepProxyHost; }
public int getEepProxyPort() { return _eepProxyPort; }
public String getWebProxyHost() { return _webProxyHost; }
public int getWebProxyPort() { return _webProxyPort; }
public String getTorProxyHost() { return _torProxyHost; }
public int getTorProxyPort() { return _torProxyPort; }
public void invalidate() {
BlogManager.instance().saveUser(this);
init();
}
public String login(String login, String pass, Properties props) {
String expectedPass = props.getProperty("password");
String hpass = Base64.encode(_context.sha().calculateHash(DataHelper.getUTF8(pass)).getData());
if (!hpass.equals(expectedPass)) {
_authenticated = false;
return "Incorrect password";
}
_username = login;
_hashedPassword = expectedPass;
// blog=luS9d3uaf....HwAE=
String b = props.getProperty("blog");
if (b != null) _blog = new Hash(Base64.decode(b));
// lastid=12345
String id = props.getProperty("lastid");
if (id != null) try { _mostRecentEntry = Long.parseLong(id); } catch (NumberFormatException nfe) {}
// lastmetaedition=12345
id = props.getProperty("lastmetaedition");
if (id != null) try { _lastMetaEntry = Long.parseLong(id); } catch (NumberFormatException nfe) {}
// groups=abc:selector,selector,selector,selector def:selector,selector,selector
StringTokenizer tok = new StringTokenizer(props.getProperty("groups", ""), " ");
while (tok.hasMoreTokens()) {
String group = tok.nextToken();
int endName = group.indexOf(':');
if (endName <= 0)
continue;
String groupName = group.substring(0, endName);
String sel = group.substring(endName+1);
List selectors = new ArrayList();
while ( (sel != null) && (sel.length() > 0) ) {
int end = sel.indexOf(',');
if (end < 0) {
selectors.add(sel);
sel = null;
} else {
if (end + 1 >= sel.length()) {
selectors.add(sel.substring(0,end));
sel = null;
} else if (end == 0) {
sel = sel.substring(1);
} else {
selectors.add(sel.substring(0, end));
sel = sel.substring(end+1);
}
}
}
_blogGroups.put(groupName.trim(), selectors);
}
// shitlist=hash,hash,hash
tok = new StringTokenizer(props.getProperty("shitlistedblogs", ""), ",");
while (tok.hasMoreTokens()) {
String blog = tok.nextToken();
byte bl[] = Base64.decode(blog);
if ( (bl != null) && (bl.length == Hash.HASH_LENGTH) )
_shitlistedBlogs.add(new Hash(bl));
}
String addr = props.getProperty("addressbook", "userhosts.txt");
if (addr != null)
_addressbookLocation = addr;
String show = props.getProperty("showimages", "false");
_showImagesByDefault = (show != null) && (show.equals("true"));
show = props.getProperty("showexpanded", "false");
_showExpandedByDefault = (show != null) && (show.equals("true"));
_defaultSelector = props.getProperty("defaultselector");
String allow = props.getProperty("allowaccessremote", "false");
_allowAccessRemote = (allow != null) && (allow.equals("true"));
_eepProxyPort = getInt(props.getProperty("eepproxyport"));
_webProxyPort = getInt(props.getProperty("webproxyport"));
_torProxyPort = getInt(props.getProperty("torproxyport"));
_eepProxyHost = props.getProperty("eepproxyhost");
_webProxyHost = props.getProperty("webproxyhost");
_torProxyHost = props.getProperty("torproxyhost");
_lastLogin = _context.clock().now();
_authenticated = true;
return LOGIN_OK;
}
private int getInt(String val) {
if (val == null) return -1;
try { return Integer.parseInt(val); } catch (NumberFormatException nfe) { return -1; }
}
public static final String LOGIN_OK = "Logged in";
public String export() {
StringBuffer buf = new StringBuffer(512);
buf.append("password=" + getHashedPassword() + "\n");
buf.append("blog=" + getBlog().toBase64() + "\n");
buf.append("lastid=" + getMostRecentEntry() + "\n");
buf.append("lastmetaedition=" + getLastMetaEntry() + "\n");
buf.append("lastlogin=" + getLastLogin() + "\n");
buf.append("addressbook=" + getAddressbookLocation() + "\n");
buf.append("showimages=" + getShowImages() + "\n");
buf.append("showexpanded=" + getShowExpanded() + "\n");
buf.append("defaultselector=" + getDefaultSelector() + "\n");
buf.append("allowaccessremote=" + _allowAccessRemote + "\n");
buf.append("groups=");
Map groups = getBlogGroups();
for (Iterator iter = groups.keySet().iterator(); iter.hasNext(); ) {
String name = (String)iter.next();
List selectors = (List)groups.get(name);
buf.append(name).append(':');
for (int i = 0; i < selectors.size(); i++) {
buf.append(selectors.get(i));
if (i + 1 < selectors.size())
buf.append(",");
}
if (iter.hasNext())
buf.append(' ');
}
buf.append('\n');
// shitlist=hash,hash,hash
List shitlistedBlogs = getShitlistedBlogs();
if (shitlistedBlogs.size() > 0) {
buf.append("shitlistedblogs=");
for (int i = 0; i < shitlistedBlogs.size(); i++) {
Hash blog = (Hash)shitlistedBlogs.get(i);
buf.append(blog.toBase64());
if (i + 1 < shitlistedBlogs.size())
buf.append(',');
}
buf.append('\n');
}
return buf.toString();
}
}


@@ -0,0 +1,11 @@
package net.i2p.syndie;
/**
*
*/
public class Version {
public static final String VERSION = "0-alpha";
public static final String BUILD = "0";
public static final String INDEX_VERSION = "1.0";
public static final String ID = "$Id$";
}


@@ -0,0 +1,438 @@
package net.i2p.syndie.data;
import java.io.*;
import java.text.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.Archive;
import net.i2p.syndie.BlogManager;
/**
* Simple read-only summary of an archive
*/
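/*
 * Hypothetical sketch of the archive.txt this class parses (values invented; see
 * load(), loadBlog() and toString() below for the authoritative handling):
 *
 *   SyndieVersion: 1.0
 *   SomeHeader: someValue
 *   Blog: <base64 blog key hash> 20050901 <tag>\t<yyyymmdd_n_sizeKB> <yyyymmdd_n_sizeKB>
 *
 *   AllBlogs: 12
 *   NewBlogs: 1
 *   AllEntries: 80
 *   NewEntries: 3
 *   TotalSize: 1048576
 *   NewSize: 20480
 *   NewestBlogs: <base64 hash> <base64 hash>
 *   NewestEntries: blog://<base64 hash>/20050811001
 */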
public class ArchiveIndex {
protected String _version;
protected long _generatedOn;
protected int _allBlogs;
protected int _newBlogs;
protected int _allEntries;
protected int _newEntries;
protected long _totalSize;
protected long _newSize;
/** list of BlogSummary objects */
protected List _blogs;
/** list of Hash objects */
protected List _newestBlogs;
/** list of BlogURI objects */
protected List _newestEntries;
/** parent message to a set of replies, ordered with the oldest first */
protected Map _replies;
protected Properties _headers;
public ArchiveIndex() {
this(false); //true);
}
public ArchiveIndex(boolean shouldLoad) {
_blogs = new ArrayList();
_newestBlogs = new ArrayList();
_newestEntries = new ArrayList();
_headers = new Properties();
_replies = Collections.synchronizedMap(new HashMap());
_generatedOn = -1;
if (shouldLoad)
setIsLocal("true");
}
public String getVersion() { return _version; }
public Properties getHeaders() { return _headers; }
public int getAllBlogs() { return _allBlogs; }
public int getNewBlogs() { return _newBlogs; }
public int getAllEntries() { return _allEntries; }
public int getNewEntries() { return _newEntries; }
public long getTotalSize() { return _totalSize; }
public long getNewSize() { return _newSize; }
public long getGeneratedOn() { return _generatedOn; }
public String getNewSizeStr() {
if (_newSize < 1024) return _newSize + "";
if (_newSize < 1024*1024) return _newSize/1024 + "KB";
else return _newSize/(1024*1024) + "MB";
}
public String getTotalSizeStr() {
if (_totalSize < 1024) return _totalSize + "";
if (_totalSize < 1024*1024) return _totalSize/1024 + "KB";
else return _totalSize/(1024*1024) + "MB";
}
/** how many blogs/tags are indexed */
public int getIndexBlogs() { return _blogs.size(); }
/** get the blog used for the given blog/tag pair */
public Hash getBlog(int index) { return ((BlogSummary)_blogs.get(index)).blog; }
/** get the tag used for the given blog/tag pair */
public String getBlogTag(int index) { return ((BlogSummary)_blogs.get(index)).tag; }
/** get the highest entry ID for the given blog/tag pair */
public long getBlogLastUpdated(int index) { return ((BlogSummary)_blogs.get(index)).lastUpdated; }
/** get the entry count for the given blog/tag pair */
public int getBlogEntryCount(int index) { return ((BlogSummary)_blogs.get(index)).entries.size(); }
/** get the entry from the given blog/tag pair */
public BlogURI getBlogEntry(int index, int entryIndex) { return ((EntrySummary)((BlogSummary)_blogs.get(index)).entries.get(entryIndex)).entry; }
/** get the raw entry size (including attachments) from the given blog/tag pair */
public long getBlogEntrySizeKB(int index, int entryIndex) { return ((EntrySummary)((BlogSummary)_blogs.get(index)).entries.get(entryIndex)).size; }
public boolean getEntryIsKnown(BlogURI uri) { return getEntry(uri) != null; }
public long getBlogEntrySizeKB(BlogURI uri) {
EntrySummary entry = getEntry(uri);
if (entry == null) return -1;
return entry.size;
}
private EntrySummary getEntry(BlogURI uri) {
if ( (uri == null) || (uri.getKeyHash() == null) || (uri.getEntryId() < 0) ) return null;
for (int i = 0; i < _blogs.size(); i++) {
BlogSummary summary = (BlogSummary)_blogs.get(i);
if (summary.blog.equals(uri.getKeyHash())) {
for (int j = 0; j < summary.entries.size(); j++) {
EntrySummary entry = (EntrySummary)summary.entries.get(j);
if (entry.entry.equals(uri))
return entry;
}
}
}
return null;
}
public Set getBlogEntryTags(BlogURI uri) {
Set tags = new HashSet();
if ( (uri == null) || (uri.getKeyHash() == null) || (uri.getEntryId() < 0) ) return tags;
for (int i = 0; i < _blogs.size(); i++) {
BlogSummary summary = (BlogSummary)_blogs.get(i);
if (summary.blog.equals(uri.getKeyHash())) {
for (int j = 0; j < summary.entries.size(); j++) {
EntrySummary entry = (EntrySummary)summary.entries.get(j);
if (entry.entry.equals(uri)) {
tags.add(summary.tag);
break;
}
}
}
}
return tags;
}
/** how many 'new' blogs are listed */
public int getNewestBlogCount() { return _newestBlogs.size(); }
public Hash getNewestBlog(int index) { return (Hash)_newestBlogs.get(index); }
/** how many 'new' entries are listed */
public int getNewestBlogEntryCount() { return _newestEntries.size(); }
public BlogURI getNewestBlogEntry(int index) { return (BlogURI)_newestEntries.get(index); }
/** list of locally known tags (String) under the given blog */
public List getBlogTags(Hash blog) {
List rv = new ArrayList();
for (int i = 0; i < _blogs.size(); i++) {
if (getBlog(i).equals(blog))
rv.add(getBlogTag(i));
}
return rv;
}
/** list of unique blogs locally known (set of Hash) */
public Set getUniqueBlogs() {
Set rv = new HashSet();
for (int i = 0; i < _blogs.size(); i++)
rv.add(getBlog(i));
return rv;
}
public List getReplies(BlogURI uri) {
Set replies = (Set)_replies.get(uri);
if (replies == null) return Collections.EMPTY_LIST;
synchronized (replies) {
return new ArrayList(replies);
}
}
public void setLocation(String location) {
try {
File l = new File(location);
if (l.exists())
load(l);
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
public void setIsLocal(String val) {
if ("true".equals(val)) {
try {
File dir = BlogManager.instance().getArchive().getArchiveDir();
load(new File(dir, Archive.INDEX_FILE));
} catch (IOException ioe) {}
}
}
public void load(File location) throws IOException {
FileInputStream in = null;
try {
in = new FileInputStream(location);
load(in);
} finally {
if (in != null)
try { in.close(); } catch (IOException ioe) {}
}
}
/** load up the index from an archive.txt */
public void load(InputStream index) throws IOException {
_allBlogs = 0;
_allEntries = 0;
_newBlogs = 0;
_newEntries = 0;
_newSize = 0;
_totalSize = 0;
_version = null;
_blogs = new ArrayList();
_newestBlogs = new ArrayList();
_newestEntries = new ArrayList();
_headers = new Properties();
BufferedReader in = new BufferedReader(new InputStreamReader(index, "UTF-8"));
String line = null;
line = in.readLine();
if (line == null)
return;
if (!line.startsWith("SyndieVersion:"))
throw new IOException("Index is invalid - it starts with " + line);
_version = line.substring("SyndieVersion:".length()).trim();
if (!_version.startsWith("1."))
throw new IOException("Index is not supported, we only handle versions 1.*, but it is " + _version);
while ( (line = in.readLine()) != null) {
if (line.length() <= 0)
break;
if (line.startsWith("Blog:")) break;
int split = line.indexOf(':');
if (split <= 0) continue;
if (split >= line.length()-1) continue;
_headers.setProperty(line.substring(0, split), line.substring(split+1));
}
if (line != null) {
do {
if (!line.startsWith("Blog:"))
break;
loadBlog(line);
} while ( (line = in.readLine()) != null);
}
// ignore the first line that doesn't start with "Blog:" - it's blank
while ( (line = in.readLine()) != null) {
int split = line.indexOf(':');
if (split <= 0) continue;
if (split >= line.length()-1) continue;
String key = line.substring(0, split);
String val = line.substring(split+1);
if (key.equals("AllBlogs"))
_allBlogs = getInt(val);
else if (key.equals("NewBlogs"))
_newBlogs = getInt(val);
else if (key.equals("AllEntries"))
_allEntries = getInt(val);
else if (key.equals("NewEntries"))
_newEntries = getInt(val);
else if (key.equals("TotalSize"))
_totalSize = getInt(val);
else if (key.equals("NewSize"))
_newSize = getInt(val);
else if (key.equals("NewestBlogs"))
_newestBlogs = parseNewestBlogs(val);
else if (key.equals("NewestEntries"))
_newestEntries = parseNewestEntries(val);
//else
// System.err.println("Key: " + key + " val: " + val);
}
}
/**
* Dig through the index for BlogURIs matching the given criteria, ordering the results by
* their own entryIds.
*
* @param out where to store the matches
* @param blog if set, what blog key must the entries be under
* @param tag if set, what tag must the entry be in
*
*/
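/* Usage sketch (not in the original source): collect every known entry for one blog,
 * newest first, across all of its tags:
 *   List uris = new ArrayList();
 *   index.selectMatchesOrderByEntryId(uris, blogKeyHash, null);
 */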
public void selectMatchesOrderByEntryId(List out, Hash blog, String tag) {
TreeMap ordered = new TreeMap();
for (int i = 0; i < _blogs.size(); i++) {
BlogSummary summary = (BlogSummary)_blogs.get(i);
if (blog != null) {
if (!blog.equals(summary.blog))
continue;
}
if (tag != null) {
if (!tag.equals(summary.tag)) {
System.out.println("Tag [" + summary.tag + "] does not match the requested [" + tag + "] in " + summary.blog.toBase64());
if (false) {
StringBuffer b = new StringBuffer(tag.length()*2);
for (int j = 0; j < tag.length(); j++) {
b.append((int)tag.charAt(j));
b.append('.');
if (summary.tag.length() > j+1)
b.append((int)summary.tag.charAt(j));
else
b.append('_');
b.append(' ');
}
System.out.println("tag.summary: " + b.toString());
}
continue;
}
}
for (int j = 0; j < summary.entries.size(); j++) {
EntrySummary entry = (EntrySummary)summary.entries.get(j);
String k = (Long.MAX_VALUE-entry.entry.getEntryId()) + "-" + entry.entry.getKeyHash().toBase64();
ordered.put(k, entry.entry);
//System.err.println("Including match: " + k);
}
}
for (Iterator iter = ordered.values().iterator(); iter.hasNext(); ) {
BlogURI entry = (BlogURI)iter.next();
if (!out.contains(entry))
out.add(entry);
}
}
private static final int getInt(String val) {
try {
return Integer.parseInt(val.trim());
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
return 0;
}
}
private List parseNewestBlogs(String vals) {
List rv = new ArrayList();
StringTokenizer tok = new StringTokenizer(vals, " \t\n");
while (tok.hasMoreTokens())
rv.add(new Hash(Base64.decode(tok.nextToken())));
return rv;
}
private List parseNewestEntries(String vals) {
List rv = new ArrayList();
StringTokenizer tok = new StringTokenizer(vals, " \t\n");
while (tok.hasMoreTokens())
rv.add(new BlogURI(tok.nextToken()));
return rv;
}
private void loadBlog(String line) throws IOException {
// Blog: hash YYYYMMDD tag\t[ yyyymmdd_n_sizeKB]*
StringTokenizer tok = new StringTokenizer(line.trim(), " \n\t");
if (tok.countTokens() < 4)
return;
tok.nextToken();
String keyStr = tok.nextToken();
Hash keyHash = new Hash(Base64.decode(keyStr));
String whenStr = tok.nextToken();
long when = getIndexDate(whenStr);
String tag = tok.nextToken();
BlogSummary summary = new BlogSummary();
summary.blog = keyHash;
summary.tag = tag.trim();
summary.lastUpdated = when;
summary.entries = new ArrayList();
while (tok.hasMoreTokens()) {
String entry = tok.nextToken();
long id = Archive.getEntryIdFromIndexName(entry);
int kb = Archive.getSizeFromIndexName(entry);
summary.entries.add(new EntrySummary(new BlogURI(keyHash, id), kb));
}
_blogs.add(summary);
}
private SimpleDateFormat _dateFmt = new SimpleDateFormat("yyyyMMdd", Locale.UK);
private long getIndexDate(String yyyymmdd) {
synchronized (_dateFmt) {
try {
return _dateFmt.parse(yyyymmdd).getTime();
} catch (ParseException pe) {
return -1;
}
}
}
private String getIndexDate(long when) {
synchronized (_dateFmt) {
return _dateFmt.format(new Date(when));
}
}
protected class BlogSummary {
Hash blog;
String tag;
long lastUpdated;
/** list of EntrySummary objects */
List entries;
public BlogSummary() {
entries = new ArrayList();
}
}
protected class EntrySummary {
BlogURI entry;
long size;
public EntrySummary(BlogURI uri, long kb) {
size = kb;
entry = uri;
}
}
/** export the index into an archive.txt */
public String toString() {
StringBuffer rv = new StringBuffer(1024);
rv.append("SyndieVersion: ").append(_version).append('\n');
for (Iterator iter = _headers.keySet().iterator(); iter.hasNext(); ) {
String key = (String)iter.next();
String val = _headers.getProperty(key);
rv.append(key).append(": ").append(val).append('\n');
}
for (int i = 0; i < _blogs.size(); i++) {
rv.append("Blog: ");
Hash blog = getBlog(i);
String tag = getBlogTag(i);
rv.append(Base64.encode(blog.getData())).append(' ');
rv.append(getIndexDate(getBlogLastUpdated(i))).append(' ');
rv.append(tag).append('\t');
int entries = getBlogEntryCount(i);
for (int j = 0; j < entries; j++) {
BlogURI entry = getBlogEntry(i, j);
long kb = getBlogEntrySizeKB(i, j);
rv.append(Archive.getIndexName(entry.getEntryId(), (int)kb*1024)).append(' ');
}
rv.append('\n');
}
rv.append('\n');
rv.append("AllBlogs: ").append(_allBlogs).append('\n');
rv.append("NewBlogs: ").append(_newBlogs).append('\n');
rv.append("AllEntries: ").append(_allEntries).append('\n');
rv.append("NewEntries: ").append(_newEntries).append('\n');
rv.append("TotalSize: ").append(_totalSize).append('\n');
rv.append("NewSize: ").append(_newSize).append('\n');
rv.append("NewestBlogs: ");
for (int i = 0; i < _newestBlogs.size(); i++)
rv.append(((Hash)(_newestBlogs.get(i))).toBase64()).append(' ');
rv.append('\n');
rv.append("NewestEntries: ");
for (int i = 0; i < _newestEntries.size(); i++)
rv.append(((BlogURI)_newestEntries.get(i)).toString()).append(' ');
rv.append('\n');
return rv.toString();
}
/** Usage: ArchiveIndex archive.txt */
public static void main(String args[]) {
try {
ArchiveIndex i = new ArchiveIndex();
i.load(new File(args[0]));
System.out.println(i.toString());
} catch (IOException ioe) { ioe.printStackTrace(); }
}
}


@@ -0,0 +1,122 @@
package net.i2p.syndie.data;
import java.io.*;
import java.util.*;
import net.i2p.data.DataHelper;
/**
*
*/
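/*
 * Added note: the raw attachment metadata is a simple "Key:value\n" list, e.g.
 * (hypothetical values)
 *   Name:photo.png
 *   Description:a holiday photo
 *   MimeType:image/png
 * as produced by createMeta() and parsed back by parseMeta() below.
 */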
public class Attachment {
private byte _data[];
private byte _rawMetadata[];
private List _keys;
private List _values;
public Attachment(byte data[], byte metadata[]) {
_data = data;
_rawMetadata = metadata;
_keys = new ArrayList();
_values = new ArrayList();
parseMeta();
}
public static final String NAME = "Name";
public static final String DESCRIPTION = "Description";
public static final String MIMETYPE = "MimeType";
public Attachment(byte data[], String name, String description, String mimeType) {
_data = data;
_keys = new ArrayList();
_values = new ArrayList();
_keys.add(NAME);
_values.add(name);
if ( (description != null) && (description.trim().length() > 0) ) {
_keys.add(DESCRIPTION);
_values.add(description);
}
if ( (mimeType != null) && (mimeType.trim().length() > 0) ) {
_keys.add(MIMETYPE);
_values.add(mimeType);
}
createMeta();
}
public byte[] getData() { return _data; }
public int getDataLength() { return _data.length; }
public byte[] getRawMetadata() { return _rawMetadata; }
public InputStream getDataStream() throws IOException { return new ByteArrayInputStream(_data); }
public String getMeta(String key) {
for (int i = 0; i < _keys.size(); i++) {
if (key.equals(_keys.get(i)))
return (String)_values.get(i);
}
return null;
}
public String getName() { return getMeta(NAME); }
public String getDescription() { return getMeta(DESCRIPTION); }
public String getMimeType() { return getMeta(MIMETYPE); }
public void setMeta(String key, String val) {
for (int i = 0; i < _keys.size(); i++) {
if (key.equals(_keys.get(i))) {
_values.set(i, val);
return;
}
}
_keys.add(key);
_values.add(val);
}
public Map getMeta() {
Map rv = new HashMap(_keys.size());
for (int i = 0; i < _keys.size(); i++) {
String k = (String)_keys.get(i);
String v = (String)_values.get(i);
rv.put(k,v);
}
return rv;
}
private void createMeta() {
StringBuffer meta = new StringBuffer(64);
for (int i = 0; i < _keys.size(); i++) {
meta.append(_keys.get(i)).append(':').append(_values.get(i)).append('\n');
}
_rawMetadata = DataHelper.getUTF8(meta);
}
private void parseMeta() {
if (_rawMetadata == null) return;
String key = null;
String val = null;
int keyBegin = 0;
int valBegin = -1;
for (int i = 0; i < _rawMetadata.length; i++) {
if (_rawMetadata[i] == ':') {
key = DataHelper.getUTF8(_rawMetadata, keyBegin, i - keyBegin);
valBegin = i + 1;
} else if (_rawMetadata[i] == '\n') {
val = DataHelper.getUTF8(_rawMetadata, valBegin, i - valBegin);
_keys.add(key);
_values.add(val);
keyBegin = i + 1;
key = null;
val = null;
}
}
}
public String toString() {
int len = 0;
if (_data != null)
len = _data.length;
return getName()
+ (getDescription() != null ? ": " + getDescription() : "")
+ (getMimeType() != null ? ", type: " + getMimeType() : "")
+ ", size: " + len;
}
}


@@ -0,0 +1,277 @@
package net.i2p.syndie.data;
import java.io.*;
import java.util.*;
import net.i2p.data.*;
import net.i2p.I2PAppContext;
/**
* Blog metadata. Formatted as: <pre>
* [key:val\n]*
* </pre>
*
* Required keys:
* Owner: base64 of their signing public key
* Signature: base64 of the DSA signature of the rest of the ordered metadata
* Edition: base10 unique identifier for this metadata (higher clobbers lower)
*
* Optional keys:
* Posters: comma delimited list of base64 signing public keys that
* can post to the blog
* Name: name of the blog
* Description: brief description of the blog
*
*/
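/*
 * Hypothetical example of a serialized BlogInfo (values invented; write() below emits
 * one "key:value" pair per line):
 *
 *   Owner:<base64 signing public key>
 *   Posters:<base64 key>,<base64 key>
 *   Name:my blog
 *   Description:a short description
 *   Edition:3
 *   Signature:<base64 DSA signature over all of the other lines, in order, excluding this one>
 */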
public class BlogInfo {
private SigningPublicKey _key;
private SigningPublicKey _posters[];
private String _optionNames[];
private String _optionValues[];
private Signature _signature;
public BlogInfo() {}
public BlogInfo(SigningPublicKey key, SigningPublicKey posters[], Properties opts) {
_optionNames = new String[0];
_optionValues = new String[0];
setKey(key);
setPosters(posters);
for (Iterator iter = opts.keySet().iterator(); iter.hasNext(); ) {
String k = (String)iter.next();
String v = opts.getProperty(k);
setProperty(k.trim(), v.trim());
}
}
public SigningPublicKey getKey() { return _key; }
public void setKey(SigningPublicKey key) {
_key = key;
setProperty(OWNER_KEY, Base64.encode(key.getData()));
}
public static final String OWNER_KEY = "Owner";
public static final String POSTERS = "Posters";
public static final String SIGNATURE = "Signature";
public static final String NAME = "Name";
public static final String DESCRIPTION = "Description";
public static final String EDITION = "Edition";
public void load(InputStream in) throws IOException {
BufferedReader reader = new BufferedReader(new InputStreamReader(in, "UTF-8"));
List names = new ArrayList();
List vals = new ArrayList();
String line = null;
while ( (line = reader.readLine()) != null) {
System.err.println("Read info line [" + line + "]");
line = line.trim();
int len = line.length();
int split = line.indexOf(':');
if ( (len <= 0) || (split <= 0) ) {
continue;
} else if (split >= len - 1) {
names.add(line.substring(0, split).trim());
vals.add("");
continue;
}
String key = line.substring(0, split).trim();
String val = line.substring(split+1).trim();
names.add(key);
vals.add(val);
}
_optionNames = new String[names.size()];
_optionValues = new String[names.size()];
for (int i = 0; i < _optionNames.length; i++) {
_optionNames[i] = (String)names.get(i);
_optionValues[i] = (String)vals.get(i);
System.out.println("Loaded info: [" + _optionNames[i] + "] = [" + _optionValues[i] + "]");
}
String keyStr = getProperty(OWNER_KEY);
if (keyStr == null) throw new IOException("Owner not found");
_key = new SigningPublicKey(Base64.decode(keyStr));
String postersStr = getProperty(POSTERS);
if (postersStr != null) {
StringTokenizer tok = new StringTokenizer(postersStr, ", \t");
_posters = new SigningPublicKey[tok.countTokens()];
for (int i = 0; tok.hasMoreTokens(); i++)
_posters[i] = new SigningPublicKey(Base64.decode(tok.nextToken()));
}
String sigStr = getProperty(SIGNATURE);
if (sigStr == null) throw new IOException("Signature not found");
_signature = new Signature(Base64.decode(sigStr));
}
public void write(OutputStream out) throws IOException { write(out, true); }
public void write(OutputStream out, boolean includeRealSignature) throws IOException {
StringBuffer buf = new StringBuffer(512);
for (int i = 0; i < _optionNames.length; i++) {
if ( (includeRealSignature) || (!SIGNATURE.equals(_optionNames[i])) )
buf.append(_optionNames[i]).append(':').append(_optionValues[i]).append('\n');
}
String s = buf.toString();
out.write(s.getBytes("UTF-8"));
}
public String getProperty(String name) {
for (int i = 0; i < _optionNames.length; i++) {
if (_optionNames[i].equals(name)) {
String val = _optionValues[i];
System.out.println("getProperty[" + name + "] = [" + val + "] [sz=" + val.length() +"]");
for (int j = 0; j < val.length(); j++) {
char c = (char)val.charAt(j);
if (c != (c & 0x7F))
System.out.println("char " + j + ": " + (int)c);
}
return val;
}
}
return null;
}
private void setProperty(String name, String val) {
for (int i = 0; i < _optionNames.length; i++) {
if (_optionNames[i].equals(name)) {
_optionValues[i] = val;
return;
}
}
String names[] = new String[_optionNames.length + 1];
String values[] = new String[_optionValues.length + 1];
for (int i = 0; i < _optionNames.length; i++) {
names[i] = _optionNames[i];
values[i] = _optionValues[i];
}
names[names.length-1] = name;
values[values.length-1] = val;
_optionNames = names;
_optionValues = values;
}
public int getEdition() {
String e = getProperty(EDITION);
if (e != null) {
try {
return Integer.parseInt(e);
} catch (NumberFormatException nfe) {
return 0;
}
}
return 0;
}
public String[] getProperties() { return _optionNames; }
public SigningPublicKey[] getPosters() { return _posters; }
public void setPosters(SigningPublicKey posters[]) {
_posters = posters;
StringBuffer buf = new StringBuffer();
for (int i = 0; posters != null && i < posters.length; i++) {
buf.append(Base64.encode(posters[i].getData()));
if (i + 1 < posters.length)
buf.append(',');
}
setProperty(POSTERS, buf.toString());
}
public boolean verify(I2PAppContext ctx) {
try {
ByteArrayOutputStream out = new ByteArrayOutputStream(512);
write(out, false);
out.close();
byte data[] = out.toByteArray();
return ctx.dsa().verifySignature(_signature, data, _key);
} catch (IOException ioe) {
return false;
}
}
public void sign(I2PAppContext ctx, SigningPrivateKey priv) {
try {
ByteArrayOutputStream out = new ByteArrayOutputStream(512);
write(out, false);
byte data[] = out.toByteArray();
Signature sig = ctx.dsa().sign(data, priv);
if (sig == null)
throw new IOException("wtf, why is the signature null? data.len = " + data.length + " priv: " + priv);
setProperty(SIGNATURE, Base64.encode(sig.getData()));
_signature = sig;
} catch (IOException ioe) {
ioe.printStackTrace();
}
}
public String toString() {
StringBuffer buf = new StringBuffer();
buf.append("Blog ").append(getKey().calculateHash().toBase64());
for (int i = 0; i < _optionNames.length; i++) {
if ( (!SIGNATURE.equals(_optionNames[i])) &&
(!OWNER_KEY.equals(_optionNames[i])) &&
(!POSTERS.equals(_optionNames[i])) )
buf.append(' ').append(_optionNames[i]).append(": ").append(_optionValues[i]);
}
if ( (_posters != null) && (_posters.length > 0) ) {
buf.append(" additional posts by");
for (int i = 0; i < _posters.length; i++) {
buf.append(' ').append(_posters[i].calculateHash().toBase64());
if (i + 1 < _posters.length)
buf.append(',');
}
}
return buf.toString();
}
private static final String TEST_STRING = "\u20AC\u00DF\u6771\u10400\u00F6";
public static void main(String args[]) {
I2PAppContext ctx = I2PAppContext.getGlobalContext();
if (true) {
try {
Object keys[] = ctx.keyGenerator().generateSigningKeypair();
SigningPublicKey pub = (SigningPublicKey)keys[0];
SigningPrivateKey priv = (SigningPrivateKey)keys[1];
Properties opts = new Properties();
opts.setProperty("Name", TEST_STRING);
opts.setProperty("Description", TEST_STRING);
opts.setProperty("Edition", "0");
opts.setProperty("ContactURL", TEST_STRING);
String nameOrig = opts.getProperty("Name");
BlogInfo info = new BlogInfo(pub, null, opts);
info.sign(ctx, priv);
boolean ok = info.verify(ctx);
System.err.println("sign&verify: " + ok);
FileOutputStream o = new FileOutputStream("bloginfo-test.dat");
info.write(o, true);
o.close();
FileInputStream i = new FileInputStream("bloginfo-test.dat");
byte buf[] = new byte[4096];
int sz = DataHelper.read(i, buf);
BlogInfo read = new BlogInfo();
read.load(new ByteArrayInputStream(buf, 0, sz));
ok = read.verify(ctx);
System.err.println("write to disk, verify read: " + ok);
System.err.println("Data: " + Base64.encode(buf, 0, sz));
System.err.println("Str : " + new String(buf, 0, sz));
System.err.println("Name ok? " + read.getProperty("Name").equals(TEST_STRING));
System.err.println("Desc ok? " + read.getProperty("Description").equals(TEST_STRING));
System.err.println("Name ok? " + read.getProperty("ContactURL").equals(TEST_STRING));
} catch (Exception e) { e.printStackTrace(); }
} else {
try {
FileInputStream in = new FileInputStream(args[0]);
BlogInfo info = new BlogInfo();
info.load(in);
boolean ok = info.verify(I2PAppContext.getGlobalContext());
System.out.println("OK? " + ok + " :" + info);
} catch (Exception e) { e.printStackTrace(); }
}
}
}


@@ -0,0 +1,96 @@
package net.i2p.syndie.data;
import java.util.*;
import net.i2p.data.*;
/**
*
*/
public class BlogURI {
private Hash _blogHash;
private long _entryId;
public BlogURI() {
this(null, -1);
}
public BlogURI(Hash blogHash, long entryId) {
_blogHash = blogHash;
_entryId = entryId;
}
public BlogURI(String uri) {
int off = -1;
if (uri.startsWith("blog://"))
off = "blog://".length();
else if (uri.startsWith("entry://"))
off = "entry://".length();
if (off < 0) {
_blogHash = null;
_entryId = -1;
return;
}
_blogHash = new Hash(Base64.decode(uri.substring(off, off+44))); // 44 chars == base64(32 bytes)
int entryStart = uri.indexOf('/', off+1);
if (entryStart < 0) {
_entryId = -1;
} else {
try {
_entryId = Long.parseLong(uri.substring(entryStart+1).trim());
} catch (NumberFormatException nfe) {
_entryId = -1;
}
}
}
public Hash getKeyHash() { return _blogHash; }
public long getEntryId() { return _entryId; }
public void setKeyHash(Hash hash) { _blogHash = hash; }
public void setEntryId(long id) { _entryId = id; }
public String toString() {
if ( (_blogHash == null) || (_blogHash.getData() == null) )
return "";
StringBuffer rv = new StringBuffer(64);
rv.append("blog://").append(Base64.encode(_blogHash.getData()));
rv.append('/');
if (_entryId >= 0)
rv.append(_entryId);
return rv.toString();
}
public boolean equals(Object obj) {
if (obj == null) return false;
if (obj.getClass() != getClass()) return false;
return DataHelper.eq(_entryId, ((BlogURI)obj)._entryId) &&
DataHelper.eq(_blogHash, ((BlogURI)obj)._blogHash);
}
public int hashCode() {
int rv = (int)_entryId;
if (_blogHash != null)
rv += _blogHash.hashCode();
return rv;
}
public static void main(String args[]) {
test("http://asdf/");
test("blog://Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=");
test("blog://Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=/");
test("blog://Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=/123456789");
test("entry://Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=/");
test("entry://Vq~AlW-r7OM763okVUFIDvVFzxOjpNNsAx0rFb2yaE8=/123456789");
}
private static void test(String uri) {
BlogURI u = new BlogURI(uri);
if (!u.toString().equals(uri))
System.err.println("Not a match: [" + uri + "] != [" + u.toString() + "]");
}
}


@@ -0,0 +1,86 @@
package net.i2p.syndie.data;
import java.io.*;
import java.util.*;
import net.i2p.data.*;
import net.i2p.I2PAppContext;
/**
* Create new blog metadata and a set of entries using some crazy UTF8-encoded chars,
* then make sure they remain valid. These blogs & entries can then be fed into
* jetty/syndie/etc to see how and where they are getting b0rked.
*/
public class EncodingTestGenerator {
public EncodingTestGenerator() {}
public static final String TEST_STRING = "\u20AC\u00DF\u6771\u10400\u00F6";
public static void main(String args[]) {
I2PAppContext ctx = I2PAppContext.getGlobalContext();
try {
Object keys[] = ctx.keyGenerator().generateSigningKeypair();
SigningPublicKey pub = (SigningPublicKey)keys[0];
SigningPrivateKey priv = (SigningPrivateKey)keys[1];
Properties opts = new Properties();
opts.setProperty("Name", TEST_STRING);
opts.setProperty("Description", TEST_STRING);
opts.setProperty("Edition", "0");
opts.setProperty("ContactURL", TEST_STRING);
String nameOrig = opts.getProperty("Name");
BlogInfo info = new BlogInfo(pub, null, opts);
info.sign(ctx, priv);
boolean ok = info.verify(ctx);
System.err.println("sign&verify: " + ok);
FileOutputStream o = new FileOutputStream("encodedMeta.dat");
info.write(o, true);
o.close();
FileInputStream i = new FileInputStream("encodedMeta.dat");
byte buf[] = new byte[4096];
int sz = DataHelper.read(i, buf);
BlogInfo read = new BlogInfo();
read.load(new ByteArrayInputStream(buf, 0, sz));
ok = read.verify(ctx);
System.err.println("write to disk, verify read: " + ok);
System.err.println("Name ok? " + read.getProperty("Name").equals(TEST_STRING));
System.err.println("Desc ok? " + read.getProperty("Description").equals(TEST_STRING));
System.err.println("Name ok? " + read.getProperty("ContactURL").equals(TEST_STRING));
// ok now lets create some entries
BlogURI uri = new BlogURI(read.getKey().calculateHash(), 0);
String tags[] = new String[4];
for (int j = 0; j < tags.length; j++)
tags[j] = TEST_STRING + "_" + j;
StringBuffer smlOrig = new StringBuffer(512);
smlOrig.append("Subject: ").append(TEST_STRING).append("\n\n");
smlOrig.append("Hi with ").append(TEST_STRING);
EntryContainer container = new EntryContainer(uri, tags, DataHelper.getUTF8(smlOrig));
container.seal(ctx, priv, null);
ok = container.verifySignature(ctx, read);
System.err.println("Sealed and verified entry: " + ok);
FileOutputStream fos = new FileOutputStream("encodedEntry.dat");
container.write(fos, true);
fos.close();
System.out.println("Written to " + new File("encodedEntry.dat").getAbsolutePath());
FileInputStream fis = new FileInputStream("encodedEntry.dat");
EntryContainer read2 = new EntryContainer();
read2.load(fis);
ok = read2.verifySignature(ctx, read);
System.out.println("Read ok? " + ok);
read2.parseRawData(ctx);
String tagsRead[] = read2.getTags();
for (int j = 0; j < tagsRead.length; j++) {
if (!tags[j].equals(tagsRead[j]))
System.err.println("Tag error [" + j + "]: read = [" + tagsRead[j] + "] want [" + tags[j] + "]");
else
System.err.println("Tag ok [" + j + "]");
}
String readText = read2.getEntry().getText();
ok = readText.equals(smlOrig.toString());
System.err.println("SML text ok? " + ok);
} catch (Exception e) { e.printStackTrace(); }
}
}


@@ -0,0 +1,14 @@
package net.i2p.syndie.data;
/**
*
*/
public class Entry {
private String _text;
public Entry(String raw) {
_text = raw;
}
public String getText() { return _text; }
}


@@ -0,0 +1,426 @@
package net.i2p.syndie.data;
import java.io.*;
import java.util.*;
import java.util.zip.*;
import net.i2p.data.*;
import net.i2p.I2PAppContext;
/**
* Securely wrap up an entry and any attachments. Container format:<pre>
* $format\n
* [$key: $val\n]*
* \n
* Signature: $base64(DSA signature)\n
* Size: sizeof(data)\n
* [data bytes]
* </pre>
*
* Required keys:
* BlogKey: base64 of the SHA256 of the blog's public key
* BlogTags: tab delimited list of tags under which this entry should be organized
* BlogEntryId: base10 unique identifier of this entry within the key/path. Typically starts
* as the current day (in unix time, milliseconds) plus further milliseconds for
* each entry within the day.
*
* The data bytes contain a zip file, either in the clear or encrypted. If the format
* is encrypted, the BlogPath key will (likely) be encrypted as well.
*
*/
public class EntryContainer {
private List _rawKeys;
private List _rawValues;
private Signature _signature;
private byte _rawData[];
private BlogURI _entryURI;
private int _format;
private Entry _entryData;
private Attachment _attachments[];
private int _completeSize;
public static final int FORMAT_ZIP_UNENCRYPTED = 0;
public static final int FORMAT_ZIP_ENCRYPTED = 1;
public static final String FORMAT_ZIP_UNENCRYPTED_STR = "syndie.entry.zip-unencrypted";
public static final String FORMAT_ZIP_ENCRYPTED_STR = "syndie.entry.zip-encrypted";
public static final String HEADER_BLOGKEY = "BlogKey";
public static final String HEADER_BLOGTAGS = "BlogTags";
public static final String HEADER_ENTRYID = "BlogEntryId";
public EntryContainer() {
_rawKeys = new ArrayList();
_rawValues = new ArrayList();
_completeSize = -1;
}
public EntryContainer(BlogURI uri, String tags[], byte smlData[]) {
this();
_entryURI = uri;
_entryData = new Entry(DataHelper.getUTF8(smlData));
setHeader(HEADER_BLOGKEY, Base64.encode(uri.getKeyHash().getData()));
StringBuffer buf = new StringBuffer();
for (int i = 0; tags != null && i < tags.length; i++)
buf.append(tags[i]).append('\t');
setHeader(HEADER_BLOGTAGS, buf.toString());
if (uri.getEntryId() < 0)
uri.setEntryId(System.currentTimeMillis());
setHeader(HEADER_ENTRYID, Long.toString(uri.getEntryId()));
}
public int getFormat() { return _format; }
private String readLine(InputStream in) throws IOException {
ByteArrayOutputStream baos = new ByteArrayOutputStream(512);
int i = 0;
while (true) {
int c = in.read();
if ( (c == (int)'\n') || (c == (int)'\r') ) {
break;
} else if (c == -1) {
if (i == 0)
return null;
else
break;
} else {
baos.write(c);
}
i++;
}
return DataHelper.getUTF8(baos.toByteArray());
//BufferedReader r = new BufferedReader(new InputStreamReader(in, "UTF-8"), 1);
//String line = r.readLine();
//return line;
}
public void load(InputStream source) throws IOException {
String line = readLine(source);
if (line == null) throw new IOException("No format line in the entry");
//System.err.println("read container format line [" + line + "]");
String fmt = line.trim();
if (FORMAT_ZIP_UNENCRYPTED_STR.equals(fmt)) {
_format = FORMAT_ZIP_UNENCRYPTED;
} else if (FORMAT_ZIP_ENCRYPTED_STR.equals(fmt)) {
_format = FORMAT_ZIP_ENCRYPTED;
} else {
throw new IOException("Unsupported entry format: " + fmt);
}
while ( (line = readLine(source)) != null) {
//System.err.println("read container header line [" + line + "]");
line = line.trim();
int len = line.length();
if (len <= 0)
break;
int split = line.indexOf(':');
if ( (split <= 0) || (split >= len - 2) )
throw new IOException("Invalid format of the syndie entry: line=" + line);
String key = line.substring(0, split);
String val = line.substring(split+1);
_rawKeys.add(key);
_rawValues.add(val);
}
parseHeaders();
String sigStr = readLine(source);
//System.err.println("read container signature line [" + line + "]");
if ( (sigStr == null) || (sigStr.indexOf("Signature:") == -1) )
throw new IOException("No signature line");
sigStr = sigStr.substring("Signature:".length()+1).trim();
_signature = new Signature(Base64.decode(sigStr));
//System.out.println("Sig: " + _signature.toBase64());
line = readLine(source);
//System.err.println("read container size line [" + line + "]");
if (line == null)
throw new IOException("No size line");
line = line.trim();
int dataSize = -1;
try {
int index = line.indexOf("Size:");
if (index == 0)
dataSize = Integer.parseInt(line.substring("Size:".length()+1).trim());
else
throw new IOException("Invalid size line");
} catch (NumberFormatException nfe) {
throw new IOException("Invalid entry size: " + line);
}
byte data[] = new byte[dataSize];
int read = DataHelper.read(source, data);
if (read != dataSize)
throw new IOException("Incomplete entry: read " + read + " expected " + dataSize);
_rawData = data;
}
public void seal(I2PAppContext ctx, SigningPrivateKey signingKey, SessionKey entryKey) throws IOException {
System.out.println("Sealing " + _entryURI);
if (entryKey == null)
_format = FORMAT_ZIP_UNENCRYPTED;
else
_format = FORMAT_ZIP_ENCRYPTED;
setHeader(HEADER_BLOGKEY, Base64.encode(_entryURI.getKeyHash().getData()));
if (_entryURI.getEntryId() < 0)
_entryURI.setEntryId(ctx.clock().now());
setHeader(HEADER_ENTRYID, Long.toString(_entryURI.getEntryId()));
_rawData = createRawData(ctx, entryKey);
ByteArrayOutputStream baos = new ByteArrayOutputStream(1024);
write(baos, false);
byte data[] = baos.toByteArray();
_signature = ctx.dsa().sign(data, signingKey);
}
private byte[] createRawData(I2PAppContext ctx, SessionKey entryKey) throws IOException {
byte raw[] = createRawData();
if (entryKey != null) {
byte iv[] = new byte[16];
ctx.random().nextBytes(iv);
byte rv[] = new byte[raw.length + iv.length];
ctx.aes().encrypt(raw, 0, rv, iv.length, entryKey, iv, raw.length);
System.arraycopy(iv, 0, rv, 0, iv.length);
return rv;
} else {
return raw;
}
}
private byte[] createRawData() throws IOException {
ByteArrayOutputStream baos = new ByteArrayOutputStream();
ZipOutputStream out = new ZipOutputStream(baos);
ZipEntry ze = new ZipEntry(ZIP_ENTRY);
byte data[] = DataHelper.getUTF8(_entryData.getText());
ze.setTime(0);
out.putNextEntry(ze);
out.write(data);
out.closeEntry();
for (int i = 0; (_attachments != null) && (i < _attachments.length); i++) {
ze = new ZipEntry(ZIP_ATTACHMENT_PREFIX + i + ZIP_ATTACHMENT_SUFFIX);
data = _attachments[i].getData();
out.putNextEntry(ze);
out.write(data);
out.closeEntry();
ze = new ZipEntry(ZIP_ATTACHMENT_META_PREFIX + i + ZIP_ATTACHMENT_META_SUFFIX);
data = _attachments[i].getRawMetadata();
out.putNextEntry(ze);
out.write(data);
out.closeEntry();
}
out.finish();
out.close();
return baos.toByteArray();
}
public static final String ZIP_ENTRY = "entry.sml";
public static final String ZIP_ATTACHMENT_PREFIX = "attachmentdata";
public static final String ZIP_ATTACHMENT_SUFFIX = ".szd";
public static final String ZIP_ATTACHMENT_META_PREFIX = "attachmentmeta";
public static final String ZIP_ATTACHMENT_META_SUFFIX = ".szm";
public void parseRawData(I2PAppContext ctx) throws IOException { parseRawData(ctx, null); }
public void parseRawData(I2PAppContext ctx, SessionKey zipKey) throws IOException {
int dataOffset = 0;
if (zipKey != null) {
byte iv[] = new byte[16];
System.arraycopy(_rawData, 0, iv, 0, iv.length);
ctx.aes().decrypt(_rawData, iv.length, _rawData, iv.length, zipKey, iv, _rawData.length - iv.length);
dataOffset = iv.length;
}
ByteArrayInputStream in = new ByteArrayInputStream(_rawData, dataOffset, _rawData.length - dataOffset);
ZipInputStream zi = new ZipInputStream(in);
Map attachments = new HashMap();
Map attachmentMeta = new HashMap();
while (true) {
ZipEntry entry = zi.getNextEntry();
if (entry == null)
break;
ByteArrayOutputStream out = new ByteArrayOutputStream(1024);
byte buf[] = new byte[1024];
int read = -1;
while ( (read = zi.read(buf)) != -1)
out.write(buf, 0, read);
byte entryData[] = out.toByteArray();
String name = entry.getName();
if (ZIP_ENTRY.equals(name)) {
_entryData = new Entry(DataHelper.getUTF8(entryData));
} else if (name.startsWith(ZIP_ATTACHMENT_PREFIX)) {
attachments.put(name, (Object)entryData);
} else if (name.startsWith(ZIP_ATTACHMENT_META_PREFIX)) {
attachmentMeta.put(name, (Object)entryData);
}
//System.out.println("Read entry [" + name + "] with size=" + entryData.length);
}
_attachments = new Attachment[attachments.size()];
for (int i = 0; i < attachments.size(); i++) {
byte data[] = (byte[])attachments.get(ZIP_ATTACHMENT_PREFIX + i + ZIP_ATTACHMENT_SUFFIX);
byte metadata[] = (byte[])attachmentMeta.get(ZIP_ATTACHMENT_META_PREFIX + i + ZIP_ATTACHMENT_META_SUFFIX);
if ( (data != null) && (metadata != null) )
_attachments[i] = new Attachment(data, metadata);
else
System.out.println("Unable to get " + i + ": " + data + "/" + metadata);
}
//System.out.println("Attachments: " + _attachments.length + "/" + attachments.size() + ": " + attachments);
}
public BlogURI getURI() { return _entryURI; }
private static final String NO_TAGS[] = new String[0];
public String[] getTags() {
String tags = getHeader(HEADER_BLOGTAGS);
if ( (tags == null) || (tags.trim().length() <= 0) ) {
return NO_TAGS;
} else {
StringTokenizer tok = new StringTokenizer(tags, "\t");
String rv[] = new String[tok.countTokens()];
for (int i = 0; i < rv.length; i++)
rv[i] = tok.nextToken().trim();
return rv;
}
}
public Signature getSignature() { return _signature; }
public Entry getEntry() { return _entryData; }
public Attachment[] getAttachments() { return _attachments; }
public void setCompleteSize(int bytes) { _completeSize = bytes; }
public int getCompleteSize() { return _completeSize; }
public String getHeader(String key) {
for (int i = 0; i < _rawKeys.size(); i++) {
String k = (String)_rawKeys.get(i);
if (k.equals(key))
return (String)_rawValues.get(i);
}
return null;
}
public Map getHeaders() {
Map rv = new HashMap(_rawKeys.size());
for (int i = 0; i < _rawKeys.size(); i++) {
String k = (String)_rawKeys.get(i);
String v = (String)_rawValues.get(i);
rv.put(k,v);
}
return rv;
}
public void setHeader(String name, String val) {
int index = _rawKeys.indexOf(name);
if (index < 0) {
_rawKeys.add(name);
_rawValues.add(val);
} else {
_rawValues.set(index, val);
}
}
public void addAttachment(byte data[], String name, String description, String mimeType) {
Attachment a = new Attachment(data, name, description, mimeType);
int old = (_attachments == null ? 0 : _attachments.length);
Attachment nv[] = new Attachment[old+1];
if (old > 0)
for (int i = 0; i < old; i++)
nv[i] = _attachments[i];
nv[old] = a;
_attachments = nv;
}
private void parseHeaders() throws IOException {
String keyHash = getHeader(HEADER_BLOGKEY);
String idVal = getHeader(HEADER_ENTRYID);
if (keyHash == null) {
System.err.println("Headers: " + _rawKeys);
System.err.println("Values : " + _rawValues);
throw new IOException("Missing " + HEADER_BLOGKEY + " header");
}
long entryId = -1;
if ( (idVal != null) && (idVal.length() > 0) ) {
try {
entryId = Long.parseLong(idVal.trim());
} catch (NumberFormatException nfe) {
System.err.println("Headers: " + _rawKeys);
System.err.println("Values : " + _rawValues);
throw new IOException("Invalid format of entryId (" + idVal + ")");
}
}
_entryURI = new BlogURI(new Hash(Base64.decode(keyHash)), entryId);
}
public boolean verifySignature(I2PAppContext ctx, BlogInfo info) {
if (_signature == null) throw new NullPointerException("sig is null");
if (info == null) throw new NullPointerException("info is null");
if (info.getKey() == null) throw new NullPointerException("info key is null");
if (info.getKey().getData() == null) throw new NullPointerException("info key data is null");
//System.out.println("Verifying " + _entryURI + " for " + info);
ByteArrayOutputStream out = new ByteArrayOutputStream(_rawData.length + 512);
try {
write(out, false);
byte dat[] = out.toByteArray();
//System.out.println("Raw data to verify: " + ctx.sha().calculateHash(dat).toBase64() + " sig: " + _signature.toBase64());
ByteArrayInputStream in = new ByteArrayInputStream(dat);
boolean ok = ctx.dsa().verifySignature(_signature, in, info.getKey());
if (!ok && info.getPosters() != null) {
for (int i = 0; !ok && i < info.getPosters().length; i++) {
in.reset();
ok = ctx.dsa().verifySignature(_signature, in, info.getPosters()[i]);
}
}
//System.out.println("Verified ok? " + ok + " key: " + info.getKey().calculateHash().toBase64());
//new Exception("verifying").printStackTrace();
return ok;
} catch (IOException ioe) {
//System.out.println("Verification failed! " + ioe.getMessage());
return false;
}
}
public void write(OutputStream out, boolean includeRealSignature) throws IOException {
StringBuffer buf = new StringBuffer(512);
switch (_format) {
case FORMAT_ZIP_ENCRYPTED:
buf.append(FORMAT_ZIP_ENCRYPTED_STR).append('\n');
break;
case FORMAT_ZIP_UNENCRYPTED:
buf.append(FORMAT_ZIP_UNENCRYPTED_STR).append('\n');
break;
default:
throw new IOException("Invalid format " + _format);
}
for (int i = 0; i < _rawKeys.size(); i++) {
String k = (String)_rawKeys.get(i);
buf.append(k.trim());
buf.append(": ");
buf.append(((String)_rawValues.get(i)).trim());
buf.append('\n');
}
buf.append('\n');
buf.append("Signature: ");
if (includeRealSignature)
buf.append(Base64.encode(_signature.getData()));
buf.append("\n");
buf.append("Size: ").append(_rawData.length).append('\n');
String str = buf.toString();
//System.out.println("Writing raw: \n[" + str + "] / " + I2PAppContext.getGlobalContext().sha().calculateHash(str.getBytes()) + ", raw data: " + I2PAppContext.getGlobalContext().sha().calculateHash(_rawData).toBase64() + "\n");
out.write(DataHelper.getUTF8(str));
out.write(_rawData);
}
public String toString() { return _entryURI.toString(); }
}
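
A minimal sketch of the container format described in the javadoc above, using only the APIs shown in this diff; the class name ContainerFormatSketch and all literal values are illustrative, not part of the commit. It seals an unencrypted entry and notes the header layout that write() emits.

package net.i2p.syndie.data;
import java.io.ByteArrayOutputStream;
import java.util.Properties;
import net.i2p.I2PAppContext;
import net.i2p.data.*;

/** Hedged sketch (not part of the commit): seal an unencrypted entry and dump its size. */
public class ContainerFormatSketch {
    public static void main(String args[]) throws Exception {
        I2PAppContext ctx = I2PAppContext.getGlobalContext();
        Object keys[] = ctx.keyGenerator().generateSigningKeypair();
        SigningPublicKey pub = (SigningPublicKey)keys[0];
        SigningPrivateKey priv = (SigningPrivateKey)keys[1];

        Properties opts = new Properties();
        opts.setProperty("Name", "example");
        BlogInfo info = new BlogInfo(pub, null, opts);
        info.sign(ctx, priv);

        BlogURI uri = new BlogURI(info.getKey().calculateHash(), -1);
        String sml = "Subject: example\n\nhello world";
        EntryContainer entry = new EntryContainer(uri, new String[] { "demo" }, DataHelper.getUTF8(sml));
        entry.seal(ctx, priv, null);               // null SessionKey => FORMAT_ZIP_UNENCRYPTED

        ByteArrayOutputStream baos = new ByteArrayOutputStream();
        entry.write(baos, true);
        // The output starts with the header section described in the class javadoc:
        //   syndie.entry.zip-unencrypted
        //   BlogKey: <base64 of the SHA256 of the blog's public key>
        //   BlogTags: demo
        //   BlogEntryId: <milliseconds since the epoch>
        //   (blank line)
        //   Signature: <base64 DSA sig over the container serialized with this value left blank>
        //   Size: <byte count of the zip payload>
        // followed by the raw zip bytes (entry.sml plus any attachments).
        System.out.println("sealed " + baos.size() + " bytes: " + entry.getURI());
        System.out.println("verifies? " + entry.verifySignature(ctx, info));
    }
}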


@@ -0,0 +1,102 @@
package net.i2p.syndie.data;
import java.util.*;
import net.i2p.data.*;
import net.i2p.syndie.Archive;
/**
* writable archive index (most are readonly)
*/
public class LocalArchiveIndex extends ArchiveIndex {
public LocalArchiveIndex() {
super(false);
}
public void setGeneratedOn(long when) { _generatedOn = when; }
public void setVersion(String v) { _version = v; }
public void setHeaders(Properties headers) { _headers = headers; }
public void setHeader(String key, String val) { _headers.setProperty(key, val); }
public void setAllBlogs(int count) { _allBlogs = count; }
public void setNewBlogs(int count) { _newBlogs = count; }
public void setAllEntries(int count) { _allEntries = count; }
public void setNewEntries(int count) { _newEntries = count; }
public void setTotalSize(long bytes) { _totalSize = bytes; }
public void setNewSize(long bytes) { _newSize = bytes; }
public void addBlog(Hash key, String tag, long lastUpdated) {
for (int i = 0; i < _blogs.size(); i++) {
BlogSummary s = (BlogSummary)_blogs.get(i);
if ( (s.blog.equals(key)) && (s.tag.equals(tag)) ) {
s.lastUpdated = Math.max(s.lastUpdated, lastUpdated);
return;
}
}
BlogSummary summary = new ArchiveIndex.BlogSummary();
summary.blog = key;
summary.tag = tag;
summary.lastUpdated = lastUpdated;
_blogs.add(summary);
}
public void addBlogEntry(Hash key, String tag, String entry) {
for (int i = 0; i < _blogs.size(); i++) {
BlogSummary summary = (BlogSummary)_blogs.get(i);
if (summary.blog.equals(key) && (summary.tag.equals(tag)) ) {
long entryId = Archive.getEntryIdFromIndexName(entry);
int kb = Archive.getSizeFromIndexName(entry);
System.out.println("Adding entry " + entryId + ", size=" + kb + "KB [" + entry + "]");
EntrySummary entrySummary = new EntrySummary(new BlogURI(key, entryId), kb);
for (int j = 0; j < summary.entries.size(); j++) {
EntrySummary cur = (EntrySummary)summary.entries.get(j);
if (cur.entry.equals(entrySummary.entry))
return;
}
summary.entries.add(entrySummary);
return;
}
}
}
public void addNewestBlog(Hash key) {
if (!_newestBlogs.contains(key))
_newestBlogs.add(key);
}
public void addNewestEntry(BlogURI entry) {
if (!_newestEntries.contains(entry))
_newestEntries.add(entry);
}
public void addReply(BlogURI parent, BlogURI reply) {
Set replies = (Set)_replies.get(parent);
if (replies == null) {
replies = Collections.synchronizedSet(new TreeSet(BlogURIComparator.HIGHEST_ID_FIRST));
_replies.put(parent, replies);
}
replies.add(reply);
//System.err.println("Adding reply to " + parent + " from child " + reply + " (# replies: " + replies.size() + ")");
}
private static class BlogURIComparator implements Comparator {
public static final BlogURIComparator HIGHEST_ID_FIRST = new BlogURIComparator(true);
public static final BlogURIComparator HIGHEST_ID_LAST = new BlogURIComparator(false);
private boolean _highestFirst;
public BlogURIComparator(boolean highestFirst) {
_highestFirst = highestFirst;
}
public int compare(Object lhs, Object rhs) {
if ( (lhs == null) || !(lhs instanceof BlogURI) ) return 1;
if ( (rhs == null) || !(rhs instanceof BlogURI) ) return -1;
BlogURI l = (BlogURI)lhs;
BlogURI r = (BlogURI)rhs;
if (l.getEntryId() > r.getEntryId())
return (_highestFirst ? -1 : 1);
else if (l.getEntryId() < r.getEntryId())
return (_highestFirst ? 1 : -1);
else
return DataHelper.compareTo(l.getKeyHash().getData(), r.getKeyHash().getData());
}
}
}
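
A minimal usage sketch for the writable index above, using only the setters shown here; LocalIndexSketch and the placeholder hash are illustrative, not part of the commit.

package net.i2p.syndie.data;
import net.i2p.data.Hash;

/** Hedged sketch (not part of the commit): build up a writable index by hand. */
public class LocalIndexSketch {
    public static void main(String args[]) {
        LocalArchiveIndex index = new LocalArchiveIndex();
        index.setVersion("1.0");
        index.setGeneratedOn(System.currentTimeMillis());

        Hash blog = new Hash(new byte[Hash.HASH_LENGTH]);   // placeholder key hash
        long now = System.currentTimeMillis();
        index.addBlog(blog, "demo", now);                   // one blog/tag pair
        index.addNewestBlog(blog);

        BlogURI parent = new BlogURI(blog, now - 1000);
        BlogURI reply  = new BlogURI(blog, now);
        index.addNewestEntry(reply);
        index.addReply(parent, reply);                       // threads the reply under its parent

        // per the comment on ArchiveIndex.toString(), this exports the archive.txt form
        System.out.println(index);
    }
}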


@@ -0,0 +1,32 @@
package net.i2p.syndie.data;
/**
*
*/
public class SafeURL {
private String _schema;
private String _location;
private String _name;
private String _description;
public SafeURL(String raw) {
parse(raw);
}
private void parse(String raw) {
if (raw != null) {
int index = raw.indexOf("://");
if ( (index <= 0) || (index + 1 >= raw.length()) )
return;
_schema = raw.substring(0, index);
_location = raw.substring(index+3);
_location = _location.replace('>', '_');
_location = _location.replace('<', '^');
}
}
public String getSchema() { return _schema; }
public String getLocation() { return _location; }
public String toString() { return _schema + "://" + _location; }
}


@@ -0,0 +1,81 @@
package net.i2p.syndie.data;
import java.io.*;
import java.text.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.Archive;
import net.i2p.syndie.BlogManager;
/**
* Simple read-only summary of an archive, proxied to the BlogManager's instance
*/
public class TransparentArchiveIndex extends ArchiveIndex {
public TransparentArchiveIndex() { super(false); }
private static ArchiveIndex index() { return BlogManager.instance().getArchive().getIndex(); }
public String getVersion() { return index().getVersion(); }
public Properties getHeaders() { return index().getHeaders(); }
public int getAllBlogs() { return index().getAllBlogs(); }
public int getNewBlogs() { return index().getNewBlogs(); }
public int getAllEntries() { return index().getAllEntries(); }
public int getNewEntries() { return index().getNewEntries(); }
public long getTotalSize() { return index().getTotalSize(); }
public long getNewSize() { return index().getNewSize(); }
public long getGeneratedOn() { return index().getGeneratedOn(); }
public String getNewSizeStr() { return index().getNewSizeStr(); }
public String getTotalSizeStr() { return index().getTotalSizeStr(); }
/** how many blogs/tags are indexed */
public int getIndexBlogs() { return index().getIndexBlogs(); }
/** get the blog used for the given blog/tag pair */
public Hash getBlog(int index) { return index().getBlog(index); }
/** get the tag used for the given blog/tag pair */
public String getBlogTag(int index) { return index().getBlogTag(index); }
/** get the highest entry ID for the given blog/tag pair */
public long getBlogLastUpdated(int index) { return index().getBlogLastUpdated(index); }
/** get the entry count for the given blog/tag pair */
public int getBlogEntryCount(int index) { return index().getBlogEntryCount(index); }
/** get the entry from the given blog/tag pair */
public BlogURI getBlogEntry(int index, int entryIndex) { return index().getBlogEntry(index, entryIndex); }
/** get the raw entry size (including attachments) from the given blog/tag pair */
public long getBlogEntrySizeKB(int index, int entryIndex) { return index().getBlogEntrySizeKB(index, entryIndex); }
public boolean getEntryIsKnown(BlogURI uri) { return index().getEntryIsKnown(uri); }
public long getBlogEntrySizeKB(BlogURI uri) { return index().getBlogEntrySizeKB(uri); }
public Set getBlogEntryTags(BlogURI uri) { return index().getBlogEntryTags(uri); }
/** how many 'new' blogs are listed */
public int getNewestBlogCount() { return index().getNewestBlogCount(); }
public Hash getNewestBlog(int index) { return index().getNewestBlog(index); }
/** how many 'new' entries are listed */
public int getNewestBlogEntryCount() { return index().getNewestBlogEntryCount(); }
public BlogURI getNewestBlogEntry(int index) { return index().getNewestBlogEntry(index); }
/** list of locally known tags (String) under the given blog */
public List getBlogTags(Hash blog) { return index().getBlogTags(blog); }
/** list of unique blogs locally known (set of Hash) */
public Set getUniqueBlogs() { return index().getUniqueBlogs(); }
public void setLocation(String location) { return; }
public void setIsLocal(String val) { return; }
public void load(File location) throws IOException { return; }
/** load up the index from an archive.txt */
public void load(InputStream index) throws IOException { return; }
/**
* Dig through the index for BlogURIs matching the given criteria, ordering the results by
* their own entryIds.
*
* @param out where to store the matches
* @param blog if non-null, only include entries under this blog key
* @param tag if non-null, only include entries with this tag
*
*/
public void selectMatchesOrderByEntryId(List out, Hash blog, String tag) {
index().selectMatchesOrderByEntryId(out, blog, tag);
}
/** export the index into an archive.txt */
public String toString() { return index().toString(); }
}
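
A minimal sketch of selectMatchesOrderByEntryId() against the proxied index, assuming BlogManager is already initialized; SelectMatchesSketch and the placeholder tag are illustrative, not part of the commit.

package net.i2p.syndie.data;
import java.util.*;
import net.i2p.data.Hash;
import net.i2p.syndie.BlogManager;

/** Hedged usage sketch (not part of the commit). */
public class SelectMatchesSketch {
    public static void main(String args[]) {
        ArchiveIndex index = BlogManager.instance().getArchive().getIndex();
        List matches = new ArrayList();
        // pick any locally known blog; "sometag" is a placeholder tag
        Iterator blogs = index.getUniqueBlogs().iterator();
        if (!blogs.hasNext()) return;
        Hash blog = (Hash)blogs.next();
        index.selectMatchesOrderByEntryId(matches, blog, "sometag");
        for (int i = 0; i < matches.size(); i++)
            System.out.println("match: " + matches.get(i));
    }
}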


@@ -0,0 +1,59 @@
package net.i2p.syndie.sml;
import java.util.List;
/**
*
*/
public class EventReceiverImpl implements SMLParser.EventReceiver {
public void receiveHeader(String header, String value) {
System.out.println("Receive header [" + header + "] = [" + value + "]");
}
public void receiveLink(String schema, String location, String text) {
System.out.println("Receive link [" + schema + "]/[" + location+ "]/[" + text + "]");
}
public void receiveBlog(String name, String blogKeyHash, String blogPath, long blogEntryId,
List blogArchiveLocations, String anchorText) {
System.out.println("Receive blog [" + name + "]/[" + blogKeyHash + "]/[" + blogPath
+ "]/[" + blogEntryId + "]/[" + blogArchiveLocations + "]/[" + anchorText + "]");
}
public void receiveArchive(String name, String description, String locationSchema, String location,
String postingKey, String anchorText) {
System.out.println("Receive archive [" + name + "]/[" + description + "]/[" + locationSchema
+ "]/[" + location + "]/[" + postingKey + "]/[" + anchorText + "]");
}
public void receiveImage(String alternateText, int attachmentId) {
System.out.println("Receive image [" + alternateText + "]/[" + attachmentId + "]");
}
public void receiveAddress(String name, String schema, String location, String anchorText) {
System.out.println("Receive address [" + name + "]/[" + schema + "]/[" + location + "]/[" + anchorText+ "]");
}
public void receiveBold(String text) { System.out.println("Receive bold [" + text+ "]"); }
public void receiveItalic(String text) { System.out.println("Receive italic [" + text+ "]"); }
public void receiveUnderline(String text) { System.out.println("Receive underline [" + text+ "]"); }
public void receiveQuote(String text, String whoQuoted, String quoteLocationSchema, String quoteLocation) {
System.out.println("Receive quote [" + text + "]/[" + whoQuoted + "]/[" + quoteLocationSchema + "]/[" + quoteLocation + "]");
}
public void receiveCode(String text, String codeLocationSchema, String codeLocation) {
System.out.println("Receive code [" + text+ "]/[" + codeLocationSchema + "]/[" + codeLocation + "]");
}
public void receiveCut(String summaryText) { System.out.println("Receive cut [" + summaryText + "]"); }
public void receivePlain(String text) { System.out.println("Receive plain [" + text + "]"); }
public void receiveNewline() { System.out.println("Receive NL"); }
public void receiveLT() { System.out.println("Receive LT"); }
public void receiveGT() { System.out.println("Receive GT"); }
public void receiveBegin() { System.out.println("Receive begin"); }
public void receiveEnd() { System.out.println("Receive end"); }
public void receiveHeaderEnd() { System.out.println("Receive header end"); }
public void receiveLeftBracket() { System.out.println("Receive ["); }
public void receiveRightBracket() { System.out.println("Receive ]"); }
public void receiveH1(String text) {}
public void receiveH2(String text) {}
public void receiveH3(String text) {}
public void receiveH4(String text) {}
public void receiveH5(String text) {}
public void receivePre(String text) {}
public void receiveHR() {}
public void receiveAttachment(int id, String anchorText) {}
}
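
A minimal sketch wiring this debug receiver to SMLParser.parse(), the same call HTMLRenderer makes below; SMLDumpSketch and the sample SML text are illustrative, not part of the commit.

package net.i2p.syndie.sml;

/** Hedged sketch (not part of the commit): dump the parse events for a small SML snippet. */
public class SMLDumpSketch {
    public static void main(String args[]) {
        // header block, blank line, then body -- mirroring the SML used elsewhere in this commit
        String sml = "Subject: hello\n\nplain text line one\nline two\n";
        // prints each receive* callback as the parser walks the text
        new SMLParser().parse(sml, new EventReceiverImpl());
    }
}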


@@ -0,0 +1,121 @@
package net.i2p.syndie.sml;
import java.io.*;
import java.text.*;
import java.util.*;
import net.i2p.data.*;
import net.i2p.syndie.*;
import net.i2p.syndie.data.*;
import net.i2p.syndie.web.*;
/**
*
*/
public class HTMLPreviewRenderer extends HTMLRenderer {
private List _filenames;
private List _fileTypes;
private List _files;
public HTMLPreviewRenderer(List filenames, List fileTypes, List files) {
super();
_filenames = filenames;
_fileTypes = fileTypes;
_files = files;
}
protected String getAttachmentURLBase() { return "viewtempattachment.jsp"; }
protected String getAttachmentURL(int id) {
return getAttachmentURLBase() + "?" +
ArchiveViewerBean.PARAM_ATTACHMENT + "=" + id;
}
public void receiveAttachment(int id, String anchorText) {
if (!continueBody()) { return; }
if ( (id < 0) || (_files == null) || (id >= _files.size()) ) {
_bodyBuffer.append(sanitizeString(anchorText));
} else {
File f = (File)_files.get(id);
String name = (String)_filenames.get(id);
String type = (String)_fileTypes.get(id);
_bodyBuffer.append("<a href=\"").append(getAttachmentURL(id)).append("\">");
_bodyBuffer.append(sanitizeString(anchorText)).append("</a>");
_bodyBuffer.append(" (").append(f.length()/1024).append("KB, ");
_bodyBuffer.append(" \"").append(sanitizeString(name)).append("\", ");
_bodyBuffer.append(sanitizeString(type)).append(")");
}
}
public void receiveEnd() {
_postBodyBuffer.append("</td></tr>\n");
_postBodyBuffer.append("<tr>\n");
_postBodyBuffer.append("<form action=\"").append(getAttachmentURLBase()).append("\">\n");
_postBodyBuffer.append("<td colspan=\"2\" valign=\"top\" align=\"left\" class=\"syndieEntryAttachmentsCell\"\n");
if (_files.size() > 0) {
_postBodyBuffer.append("<b>Attachments:</b> ");
_postBodyBuffer.append("<select name=\"").append(ArchiveViewerBean.PARAM_ATTACHMENT).append("\">\n");
for (int i = 0; i < _files.size(); i++) {
_postBodyBuffer.append("<option value=\"").append(i).append("\">");
File f = (File)_files.get(i);
String name = (String)_filenames.get(i);
String type = (String)_fileTypes.get(i);
_postBodyBuffer.append(sanitizeString(name));
_postBodyBuffer.append(" (").append(f.length()/1024).append("KB");
_postBodyBuffer.append(", type ").append(sanitizeString(type)).append(")</option>\n");
}
_postBodyBuffer.append("</select>\n");
_postBodyBuffer.append("<input type=\"submit\" value=\"Download\" name=\"Download\" /><br />\n");
}
if (_blogs.size() > 0) {
_postBodyBuffer.append("<b>Blog references:</b> ");
for (int i = 0; i < _blogs.size(); i++) {
Blog b = (Blog)_blogs.get(i);
_postBodyBuffer.append("<a href=\"").append(getPageURL(new Hash(Base64.decode(b.hash)), b.tag, b.entryId, -1, -1, (_user != null ? _user.getShowExpanded() : false), (_user != null ? _user.getShowImages() : false)));
_postBodyBuffer.append("\">").append(sanitizeString(b.name)).append("</a> ");
}
_postBodyBuffer.append("<br />\n");
}
if (_links.size() > 0) {
_postBodyBuffer.append("<b>External links:</b> ");
for (int i = 0; i < _links.size(); i++) {
Link l = (Link)_links.get(i);
_postBodyBuffer.append("<a href=\"externallink.jsp?schema=");
_postBodyBuffer.append(sanitizeURL(l.schema)).append("&location=");
_postBodyBuffer.append(sanitizeURL(l.location));
_postBodyBuffer.append("\">").append(sanitizeString(l.location));
_postBodyBuffer.append(" (").append(sanitizeString(l.schema)).append(")</a> ");
}
_postBodyBuffer.append("<br />\n");
}
if (_addresses.size() > 0) {
_postBodyBuffer.append("<b>Addresses:</b> ");
for (int i = 0; i < _addresses.size(); i++) {
Address a = (Address)_addresses.get(i);
_postBodyBuffer.append("<a href=\"addaddress.jsp?schema=");
_postBodyBuffer.append(sanitizeURL(a.schema)).append("&location=");
_postBodyBuffer.append(sanitizeURL(a.location)).append("&name=");
_postBodyBuffer.append(sanitizeURL(a.name));
_postBodyBuffer.append("\">").append(sanitizeString(a.name));
}
_postBodyBuffer.append("<br />\n");
}
if (_archives.size() > 0) {
_postBodyBuffer.append("<b>Archives:</b>");
for (int i = 0; i < _archives.size(); i++) {
ArchiveRef a = (ArchiveRef)_archives.get(i);
_postBodyBuffer.append(" <a href=\"").append(getArchiveURL(null, new SafeURL(a.locationSchema + "://" + a.location)));
_postBodyBuffer.append("\">").append(sanitizeString(a.name)).append("</a>");
if (a.description != null)
_postBodyBuffer.append(": ").append(sanitizeString(a.description));
}
_postBodyBuffer.append("<br />\n");
}
_postBodyBuffer.append("</td>\n</form>\n</tr>\n");
_postBodyBuffer.append("</table>\n");
}
}


@@ -0,0 +1,823 @@
package net.i2p.syndie.sml;
import java.io.*;
import java.text.*;
import java.util.*;
import net.i2p.data.*;
import net.i2p.syndie.*;
import net.i2p.syndie.data.*;
import net.i2p.syndie.web.*;
/**
*
*/
public class HTMLRenderer extends EventReceiverImpl {
protected SMLParser _parser;
protected Writer _out;
protected User _user;
protected Archive _archive;
protected EntryContainer _entry;
protected boolean _showImages;
protected boolean _cutBody;
protected boolean _cutReached;
protected int _cutSize;
protected int _lastNewlineAt;
protected Map _headers;
protected List _addresses;
protected List _links;
protected List _blogs;
protected List _archives;
protected StringBuffer _preBodyBuffer;
protected StringBuffer _bodyBuffer;
protected StringBuffer _postBodyBuffer;
public HTMLRenderer() {
_parser = new SMLParser();
}
/**
* Usage: HTMLRenderer smlFile outputFile
*/
public static void main(String args[]) {
if (args.length != 2) {
System.err.println("Usage: HTMLRenderer smlFile outputFile");
return;
}
HTMLRenderer renderer = new HTMLRenderer();
Writer out = null;
try {
ByteArrayOutputStream baos = new ByteArrayOutputStream(1024*512);
FileInputStream in = new FileInputStream(args[0]);
byte buf[] = new byte[1024];
int read = 0;
while ( (read = in.read(buf)) != -1)
baos.write(buf, 0, read);
out = new OutputStreamWriter(new FileOutputStream(args[1]), "UTF-8");
renderer.render(new User(), BlogManager.instance().getArchive(), null, DataHelper.getUTF8(baos.toByteArray()), out, false, true);
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (out != null) try { out.close(); } catch (IOException ioe) {}
}
}
public void renderUnknownEntry(User user, Archive archive, BlogURI uri, Writer out) throws IOException {
BlogInfo info = archive.getBlogInfo(uri);
if (info == null)
out.write("<br />The blog " + uri.getKeyHash().toBase64() + " is not known locally. "
+ "Please get it from an archive and <a href=\""
+ getPageURL(uri.getKeyHash(), null, uri.getEntryId(), -1, -1, user.getShowExpanded(), user.getShowImages())
+ "\">try again</a>");
else
out.write("<br />The blog <a href=\""
+ getPageURL(uri.getKeyHash(), null, -1, -1, -1, user.getShowExpanded(), user.getShowImages())
+ "\">" + info.getProperty(BlogInfo.NAME) + "</a> is known, but the entry " + uri.getEntryId() + " is not. "
+ "Please get it from an archive and <a href=\""
+ getPageURL(uri.getKeyHash(), null, uri.getEntryId(), -1, -1, user.getShowExpanded(), user.getShowImages())
+ "\">try again</a>");
}
public void render(User user, Archive archive, EntryContainer entry, Writer out, boolean cutBody, boolean showImages) throws IOException {
if (entry == null)
return;
render(user, archive, entry, entry.getEntry().getText(), out, cutBody, showImages);
}
public void render(User user, Archive archive, EntryContainer entry, String rawSML, Writer out, boolean cutBody, boolean showImages) throws IOException {
_user = user;
_archive = archive;
_entry = entry;
_out = out;
_headers = new HashMap();
_preBodyBuffer = new StringBuffer(1024);
_bodyBuffer = new StringBuffer(1024);
_postBodyBuffer = new StringBuffer(1024);
_addresses = new ArrayList();
_links = new ArrayList();
_blogs = new ArrayList();
_archives = new ArrayList();
_cutBody = cutBody;
_showImages = showImages;
_cutReached = false;
_cutSize = 1024;
_parser.parse(rawSML, this);
_out.write(_preBodyBuffer.toString());
_out.write(_bodyBuffer.toString());
_out.write(_postBodyBuffer.toString());
//int len = _preBodyBuffer.length() + _bodyBuffer.length() + _postBodyBuffer.length();
//System.out.println("Wrote " + len);
}
public void receivePlain(String text) {
if (!continueBody()) { return; }
_bodyBuffer.append(sanitizeString(text));
}
public void receiveBold(String text) {
if (!continueBody()) { return; }
_bodyBuffer.append("<b>").append(sanitizeString(text)).append("</b>");
}
public void receiveItalic(String text) {
if (!continueBody()) { return; }
_bodyBuffer.append("<i>").append(sanitizeString(text)).append("</i>");
}
public void receiveUnderline(String text) {
if (!continueBody()) { return; }
_bodyBuffer.append("<u>").append(sanitizeString(text)).append("</u>");
}
public void receiveHR() {
if (!continueBody()) { return; }
_bodyBuffer.append("<hr />");
}
public void receiveH1(String body) {
if (!continueBody()) { return; }
_bodyBuffer.append("<h1>").append(sanitizeString(body)).append("</h1>");
}
public void receiveH2(String body) {
if (!continueBody()) { return; }
_bodyBuffer.append("<h2>").append(sanitizeString(body)).append("</h2>");
}
public void receiveH3(String body) {
if (!continueBody()) { return; }
_bodyBuffer.append("<h3>").append(sanitizeString(body)).append("</h3>");
}
public void receiveH4(String body) {
if (!continueBody()) { return; }
_bodyBuffer.append("<h4>").append(sanitizeString(body)).append("</h4>");
}
public void receiveH5(String body) {
if (!continueBody()) { return; }
_bodyBuffer.append("<h5>").append(sanitizeString(body)).append("</h5>");
}
public void receivePre(String body) {
if (!continueBody()) { return; }
_bodyBuffer.append("<pre>").append(sanitizeString(body)).append("</pre>");
}
public void receiveQuote(String text, String whoQuoted, String quoteLocationSchema, String quoteLocation) {
if (!continueBody()) { return; }
_bodyBuffer.append("<quote>").append(sanitizeString(text)).append("</quote>");
}
public void receiveCode(String text, String codeLocationSchema, String codeLocation) {
if (!continueBody()) { return; }
_bodyBuffer.append("<code>").append(sanitizeString(text)).append("</code>");
}
public void receiveImage(String alternateText, int attachmentId) {
if (!continueBody()) { return; }
if (_showImages) {
_bodyBuffer.append("<img src=\"").append(getAttachmentURL(attachmentId)).append("\"");
if (alternateText != null)
_bodyBuffer.append(" alt=\"").append(sanitizeTagParam(alternateText)).append("\"");
_bodyBuffer.append(" />");
} else {
_bodyBuffer.append("[image: attachment ").append(attachmentId);
_bodyBuffer.append(": ").append(sanitizeString(alternateText));
_bodyBuffer.append(" <a href=\"").append(getEntryURL(true)).append("\">view images</a>]");
}
}
public void receiveCut(String summaryText) {
if (!continueBody()) { return; }
_cutReached = true;
if (_cutBody) {
_bodyBuffer.append("<a href=\"").append(getEntryURL()).append("\">");
if ( (summaryText != null) && (summaryText.length() > 0) )
_bodyBuffer.append(sanitizeString(summaryText));
else
_bodyBuffer.append("more inside...");
_bodyBuffer.append("</a>\n");
} else {
if (summaryText != null)
_bodyBuffer.append(sanitizeString(summaryText));
}
}
/** are we either before the cut or rendering without cutting? */
protected boolean continueBody() {
boolean rv = ( (!_cutReached) && (_bodyBuffer.length() <= _cutSize) ) || (!_cutBody);
//if (!rv)
// System.out.println("rv: " + rv + " Cut reached: " + _cutReached + " bodyBufferSize: " + _bodyBuffer.length() + " cutBody? " + _cutBody);
if (!rv && !_cutReached) {
// exceeded the allowed size
_bodyBuffer.append("<a href=\"").append(getEntryURL()).append("\">more inside...</a>");
_cutReached = true;
}
return rv;
}
public void receiveNewline() {
if (!continueBody()) { return; }
if (true || (_lastNewlineAt >= _bodyBuffer.length()))
_bodyBuffer.append("<br />\n");
else
_lastNewlineAt = _bodyBuffer.length();
}
public void receiveLT() {
if (!continueBody()) { return; }
_bodyBuffer.append("&lt;");
}
public void receiveGT() {
if (!continueBody()) { return; }
_bodyBuffer.append("&gt;");
}
public void receiveBegin() {}
public void receiveLeftBracket() {
if (!continueBody()) { return; }
_bodyBuffer.append('[');
}
public void receiveRightBracket() {
if (!continueBody()) { return; }
_bodyBuffer.append(']');
}
protected static class Blog {
public String name;
public String hash;
public String tag;
public long entryId;
public List locations;
public int hashCode() { return -1; }
public boolean equals(Object o) {
Blog b = (Blog)o;
return DataHelper.eq(hash, b.hash) && DataHelper.eq(tag, b.tag) && DataHelper.eq(name, b.name)
&& DataHelper.eq(entryId, b.entryId) && DataHelper.eq(locations, b.locations);
}
}
/**
* when we see a link to a blog, we may want to:
* = view the blog entry
* = view all entries in that blog
* = view all entries in that blog with the given tag
* = view the blog's metadata
* = [fetch the blog from other locations]
* = [add the blog's locations to our list of known locations]
* = [shitlist the blog]
* = [add the blog to one of our groups]
*
* [blah] implies *later*.
*
* Currently renders to:
* <a href="$entryURL">$description</a>
* [blog: <a href="$blogURL">$name</a> (<a href="$metaURL">meta</a>)
* [tag: <a href="$blogTagURL">$tag</a>]
* archived at $location*]
*
*/
public void receiveBlog(String name, String hash, String tag, long entryId, List locations, String description) {
System.out.println("Receiving the blog: " + name + "/" + hash + "/" + tag + "/" + entryId +"/" + locations + ": "+ description);
byte blogData[] = Base64.decode(hash);
if ( (blogData == null) || (blogData.length != Hash.HASH_LENGTH) )
return;
Blog b = new Blog();
b.name = name;
b.hash = hash;
b.tag = tag;
b.entryId = entryId;
b.locations = locations;
if (!_blogs.contains(b))
_blogs.add(b);
if (!continueBody()) { return; }
if (hash == null) return;
Hash blog = new Hash(blogData);
if (entryId > 0) {
String pageURL = getPageURL(blog, tag, entryId, -1, -1, true, (_user != null ? _user.getShowImages() : false));
_bodyBuffer.append("<a href=\"").append(pageURL).append("\">");
if ( (description != null) && (description.trim().length() > 0) ) {
_bodyBuffer.append(sanitizeString(description));
} else if ( (name != null) && (name.trim().length() > 0) ) {
_bodyBuffer.append(sanitizeString(name));
} else {
_bodyBuffer.append("[view entry]");
}
_bodyBuffer.append("</a>");
}
String url = getPageURL(blog, null, -1, -1, -1, (_user != null ? _user.getShowExpanded() : false), (_user != null ? _user.getShowImages() : false));
_bodyBuffer.append(" [<a href=\"").append(url);
_bodyBuffer.append("\">");
if ( (name != null) && (name.trim().length() > 0) )
_bodyBuffer.append(sanitizeString(name));
else
_bodyBuffer.append("view");
_bodyBuffer.append("</a> (<a href=\"").append(getMetadataURL(blog)).append("\">meta</a>)");
if ( (tag != null) && (tag.trim().length() > 0) ) {
url = getPageURL(blog, tag, -1, -1, -1, false, false);
_bodyBuffer.append(" <a href=\"").append(url);
_bodyBuffer.append("\">Tag: ").append(sanitizeString(tag)).append("</a>");
}
if ( (locations != null) && (locations.size() > 0) ) {
_bodyBuffer.append(" Archives: ");
for (int i = 0; i < locations.size(); i++) {
SafeURL surl = (SafeURL)locations.get(i);
if (_user.getAuthenticated() && _user.getAllowAccessRemote())
_bodyBuffer.append("<a href=\"").append(getArchiveURL(blog, surl)).append("\">").append(sanitizeString(surl.toString())).append("</a> ");
else
_bodyBuffer.append(sanitizeString(surl.toString())).append(' ');
}
}
_bodyBuffer.append("] ");
}
protected static class ArchiveRef {
public String name;
public String description;
public String locationSchema;
public String location;
public int hashCode() { return -1; }
public boolean equals(Object o) {
ArchiveRef a = (ArchiveRef)o;
return DataHelper.eq(name, a.name) && DataHelper.eq(description, a.description)
&& DataHelper.eq(locationSchema, a.locationSchema)
&& DataHelper.eq(location, a.location);
}
}
public void receiveArchive(String name, String description, String locationSchema, String location,
String postingKey, String anchorText) {
ArchiveRef a = new ArchiveRef();
a.name = name;
a.description = description;
a.locationSchema = locationSchema;
a.location = location;
if (!_archives.contains(a))
_archives.add(a);
if (!continueBody()) { return; }
_bodyBuffer.append(sanitizeString(anchorText)).append(" [Archive ");
if (name != null)
_bodyBuffer.append(sanitizeString(name));
if (location != null) {
_bodyBuffer.append(" at ");
SafeURL surl = new SafeURL(locationSchema + "://" + location);
_bodyBuffer.append("<a href=\"").append(getArchiveURL(null, surl));
_bodyBuffer.append("\">").append(sanitizeString(surl.toString())).append("</a>");
}
if (description != null)
_bodyBuffer.append(": ").append(sanitizeString(description));
_bodyBuffer.append("]");
}
protected static class Link {
public String schema;
public String location;
public int hashCode() { return -1; }
public boolean equals(Object o) {
Link l = (Link)o;
return DataHelper.eq(schema, l.schema) && DataHelper.eq(location, l.location);
}
}
public void receiveLink(String schema, String location, String text) {
Link l = new Link();
l.schema = schema;
l.location = location;
if (!_links.contains(l))
_links.add(l);
if (!continueBody()) { return; }
if ( (schema == null) || (location == null) ) return;
_bodyBuffer.append("<a href=\"externallink.jsp?schema=");
_bodyBuffer.append(sanitizeURL(schema)).append("&location=");
_bodyBuffer.append(sanitizeURL(location)).append("&description=");
_bodyBuffer.append(sanitizeURL(text)).append("\">").append(sanitizeString(text)).append("</a>");
}
protected static class Address {
public String name;
public String schema;
public String location;
public int hashCode() { return -1; }
public boolean equals(Object o) {
Address a = (Address)o;
return DataHelper.eq(schema, a.schema) && DataHelper.eq(location, a.location) && DataHelper.eq(name, a.name);
}
}
public void receiveAddress(String name, String schema, String location, String anchorText) {
Address a = new Address();
a.name = name;
a.schema = schema;
a.location = location;
if (!_addresses.contains(a))
_addresses.add(a);
if (!continueBody()) { return; }
if ( (schema == null) || (location == null) ) return;
_bodyBuffer.append("<a href=\"addaddress.jsp?schema=");
_bodyBuffer.append(sanitizeURL(schema)).append("&name=");
_bodyBuffer.append(sanitizeURL(name)).append("&location=");
_bodyBuffer.append(sanitizeURL(location)).append("\">").append(sanitizeString(anchorText)).append("</a>");
}
public void receiveAttachment(int id, String anchorText) {
if (!continueBody()) { return; }
Attachment attachments[] = _entry.getAttachments();
if ( (id < 0) || (id >= attachments.length)) {
_bodyBuffer.append(sanitizeString(anchorText));
} else {
_bodyBuffer.append("<a href=\"").append(getAttachmentURL(id)).append("\">");
_bodyBuffer.append(sanitizeString(anchorText)).append("</a>");
_bodyBuffer.append(" (").append(attachments[id].getDataLength()/1024).append("KB, ");
_bodyBuffer.append(" \"").append(sanitizeString(attachments[id].getName())).append("\", ");
_bodyBuffer.append(sanitizeString(attachments[id].getMimeType())).append(")");
}
}
public void receiveEnd() {
_postBodyBuffer.append("</td></tr>\n");
if (_cutBody) {
_postBodyBuffer.append("<tr class=\"syndieEntryAttachmentsCell\">\n");
_postBodyBuffer.append("<td colspan=\"2\" valign=\"top\" align=\"left\" class=\"syndieEntryAttachmentsCell\">");
_postBodyBuffer.append("<a href=\"").append(getEntryURL()).append("\">View details...</a> ");
if ( (_entry != null) && (_entry.getAttachments() != null) && (_entry.getAttachments().length > 0) ) {
int num = _entry.getAttachments().length;
if (num == 1)
_postBodyBuffer.append("1 attachment ");
else
_postBodyBuffer.append(num + " attachments ");
}
int blogs = _blogs.size();
if (blogs == 1)
_postBodyBuffer.append("1 blog reference ");
else if (blogs > 1)
_postBodyBuffer.append(blogs).append(" blog references ");
int links = _links.size();
if (links == 1)
_postBodyBuffer.append("1 external link ");
else if (links > 1)
_postBodyBuffer.append(links).append(" external links");
int addrs = _addresses.size();
if (addrs == 1)
_postBodyBuffer.append("1 address ");
else if (addrs > 1)
_postBodyBuffer.append(addrs).append(" addresses ");
int archives = _archives.size();
if (archives == 1)
_postBodyBuffer.append("1 archive ");
else if (archives > 1)
_postBodyBuffer.append(archives).append(" archives ");
if (_entry != null) {
List replies = _archive.getIndex().getReplies(_entry.getURI());
if ( (replies != null) && (replies.size() > 0) ) {
if (replies.size() == 1)
_postBodyBuffer.append("1 reply ");
else
_postBodyBuffer.append(replies.size()).append(" replies ");
}
}
String inReplyTo = (String)_headers.get(HEADER_IN_REPLY_TO);
if ( (inReplyTo != null) && (inReplyTo.trim().length() > 0) )
_postBodyBuffer.append(" <a href=\"").append(getPageURL(sanitizeTagParam(inReplyTo))).append("\">(view parent)</a>\n");
_postBodyBuffer.append("</td></tr>\n");
} else {
_postBodyBuffer.append("<tr class=\"syndieEntryAttachmentsCell\">\n");
_postBodyBuffer.append("<form action=\"").append(getAttachmentURLBase()).append("\">\n");
_postBodyBuffer.append("<input type=\"hidden\" name=\"").append(ArchiveViewerBean.PARAM_BLOG);
_postBodyBuffer.append("\" value=\"");
if (_entry != null)
_postBodyBuffer.append(Base64.encode(_entry.getURI().getKeyHash().getData()));
else
_postBodyBuffer.append("unknown");
_postBodyBuffer.append("\" />\n");
_postBodyBuffer.append("<input type=\"hidden\" name=\"").append(ArchiveViewerBean.PARAM_ENTRY);
_postBodyBuffer.append("\" value=\"");
if (_entry != null)
_postBodyBuffer.append(_entry.getURI().getEntryId());
else
_postBodyBuffer.append("unknown");
_postBodyBuffer.append("\" />\n");
_postBodyBuffer.append("<td colspan=\"2\" valign=\"top\" align=\"left\" class=\"syndieEntryAttachmentsCell\">\n");
if ( (_entry != null) && (_entry.getAttachments() != null) && (_entry.getAttachments().length > 0) ) {
_postBodyBuffer.append("<b>Attachments:</b> ");
_postBodyBuffer.append("<select name=\"").append(ArchiveViewerBean.PARAM_ATTACHMENT).append("\">\n");
for (int i = 0; i < _entry.getAttachments().length; i++) {
_postBodyBuffer.append("<option value=\"").append(i).append("\">");
Attachment a = _entry.getAttachments()[i];
_postBodyBuffer.append(sanitizeString(a.getName()));
if ( (a.getDescription() != null) && (a.getDescription().trim().length() > 0) ) {
_postBodyBuffer.append(": ");
_postBodyBuffer.append(sanitizeString(a.getDescription()));
}
_postBodyBuffer.append(" (").append(a.getDataLength()/1024).append("KB");
_postBodyBuffer.append(", type ").append(sanitizeString(a.getMimeType())).append(")</option>\n");
}
_postBodyBuffer.append("</select>\n");
_postBodyBuffer.append("<input type=\"submit\" value=\"Download\" name=\"Download\" /><br />\n");
}
if (_blogs.size() > 0) {
_postBodyBuffer.append("<b>Blog references:</b> ");
for (int i = 0; i < _blogs.size(); i++) {
Blog b = (Blog)_blogs.get(i);
_postBodyBuffer.append("<a href=\"").append(getPageURL(new Hash(Base64.decode(b.hash)), b.tag, b.entryId, -1, -1, (_user != null ? _user.getShowExpanded() : false), (_user != null ? _user.getShowImages() : false)));
_postBodyBuffer.append("\">").append(sanitizeString(b.name)).append("</a> ");
}
_postBodyBuffer.append("<br />\n");
}
if (_links.size() > 0) {
_postBodyBuffer.append("<b>External links:</b> ");
for (int i = 0; i < _links.size(); i++) {
Link l = (Link)_links.get(i);
_postBodyBuffer.append("<a href=\"externallink.jsp?schema=");
_postBodyBuffer.append(sanitizeURL(l.schema)).append("&location=");
_postBodyBuffer.append(sanitizeURL(l.location));
_postBodyBuffer.append("\">").append(sanitizeString(l.location));
_postBodyBuffer.append(" (").append(sanitizeString(l.schema)).append(")</a> ");
}
_postBodyBuffer.append("<br />\n");
}
if (_addresses.size() > 0) {
_postBodyBuffer.append("<b>Addresses:</b>");
for (int i = 0; i < _addresses.size(); i++) {
Address a = (Address)_addresses.get(i);
_postBodyBuffer.append(" <a href=\"addaddress.jsp?schema=");
_postBodyBuffer.append(sanitizeURL(a.schema)).append("&location=");
_postBodyBuffer.append(sanitizeURL(a.location)).append("&name=");
_postBodyBuffer.append(sanitizeURL(a.name));
_postBodyBuffer.append("\">").append(sanitizeString(a.name));
}
_postBodyBuffer.append("<br />\n");
}
if (_archives.size() > 0) {
_postBodyBuffer.append("<b>Archives:</b>");
for (int i = 0; i < _archives.size(); i++) {
ArchiveRef a = (ArchiveRef)_archives.get(i);
_postBodyBuffer.append(" <a href=\"").append(getArchiveURL(null, new SafeURL(a.locationSchema + "://" + a.location)));
_postBodyBuffer.append("\">").append(sanitizeString(a.name)).append("</a>");
if (a.description != null)
_postBodyBuffer.append(": ").append(sanitizeString(a.description));
}
_postBodyBuffer.append("<br />\n");
}
if (_entry != null) {
List replies = _archive.getIndex().getReplies(_entry.getURI());
if ( (replies != null) && (replies.size() > 0) ) {
_postBodyBuffer.append("<b>Replies:</b> ");
for (int i = 0; i < replies.size(); i++) {
BlogURI reply = (BlogURI)replies.get(i);
_postBodyBuffer.append("<a href=\"");
_postBodyBuffer.append(getPageURL(reply.getKeyHash(), null, reply.getEntryId(), -1, -1, true, _user.getShowImages()));
_postBodyBuffer.append("\">");
BlogInfo replyAuthor = _archive.getBlogInfo(reply);
if (replyAuthor != null) {
_postBodyBuffer.append(sanitizeString(replyAuthor.getProperty(BlogInfo.NAME)));
} else {
_postBodyBuffer.append(reply.getKeyHash().toBase64().substring(0,16));
}
_postBodyBuffer.append(" on ");
_postBodyBuffer.append(getEntryDate(reply.getEntryId()));
_postBodyBuffer.append("</a> ");
}
_postBodyBuffer.append("<br />");
}
}
String inReplyTo = (String)_headers.get(HEADER_IN_REPLY_TO);
if ( (inReplyTo != null) && (inReplyTo.trim().length() > 0) ) {
_postBodyBuffer.append(" <a href=\"").append(getPageURL(sanitizeTagParam(inReplyTo))).append("\">(view parent)</a><br />\n");
}
_postBodyBuffer.append("</td>\n</form>\n</tr>\n");
}
_postBodyBuffer.append("</table>\n");
}
public void receiveHeader(String header, String value) {
//System.err.println("Receive header [" + header + "] = [" + value + "]");
_headers.put(header, value);
}
public void receiveHeaderEnd() {
_preBodyBuffer.append("<table width=\"100%\" border=\"0\">\n");
renderSubjectCell();
renderMetaCell();
renderPreBodyCell();
}
public static final String HEADER_SUBJECT = "Subject";
public static final String HEADER_BGCOLOR = "bgcolor";
public static final String HEADER_IN_REPLY_TO = "InReplyTo";
private void renderSubjectCell() {
_preBodyBuffer.append("<tr class=\"syndieEntrySubjectCell\"><td align=\"left\" valign=\"top\" class=\"syndieEntrySubjectCell\" width=\"400\"> ");
String subject = (String)_headers.get(HEADER_SUBJECT);
if (subject == null)
subject = "[no subject]";
_preBodyBuffer.append(sanitizeString(subject));
_preBodyBuffer.append("</td>\n");
}
private void renderPreBodyCell() {
String bgcolor = (String)_headers.get(HEADER_BGCOLOR);
if (_cutBody)
_preBodyBuffer.append("<tr class=\"syndieEntrySummaryCell\"><td colspan=\"2\" align=\"left\" valign=\"top\" class=\"syndieEntrySummaryCell\" " + (bgcolor != null ? "bgcolor=\"" + sanitizeTagParam(bgcolor) + "\"" : "") + "\">");
else
_preBodyBuffer.append("<tr class=\"syndieEntryBodyCell\"><td colspan=\"2\" align=\"left\" valign=\"top\" class=\"syndieEntryBodyCell\" " + (bgcolor != null ? "bgcolor=\"" + sanitizeTagParam(bgcolor) + "\"" : "") + "\">");
}
private void renderMetaCell() {
String tags[] = (_entry != null ? _entry.getTags() : null);
if ( (tags != null) && (tags.length > 0) )
_preBodyBuffer.append("<form action=\"index.jsp\">");
_preBodyBuffer.append("<td nowrap=\"true\" align=\"right\" valign=\"top\" class=\"syndieEntryMetaCell\">\n");
BlogInfo info = null;
if (_entry != null)
info = _archive.getBlogInfo(_entry.getURI());
if (info != null) {
_preBodyBuffer.append("<a href=\"").append(getMetadataURL()).append("\">");
String nameStr = info.getProperty("Name");
if (nameStr == null)
_preBodyBuffer.append("[no name]");
else
_preBodyBuffer.append(sanitizeString(nameStr));
_preBodyBuffer.append("</a>");
} else {
_preBodyBuffer.append("[unknown blog]");
}
if ( (tags != null) && (tags.length > 0) ) {
_preBodyBuffer.append(" Tags: ");
_preBodyBuffer.append("<select name=\"selector\">");
for (int i = 0; tags != null && i < tags.length; i++) {
_preBodyBuffer.append("<option value=\"blogtag://");
_preBodyBuffer.append(_entry.getURI().getKeyHash().toBase64());
_preBodyBuffer.append('/').append(Base64.encode(DataHelper.getUTF8(tags[i]))).append("\">");
_preBodyBuffer.append(sanitizeString(tags[i]));
_preBodyBuffer.append("</option>\n");
/*
_preBodyBuffer.append("<a href=\"");
_preBodyBuffer.append(getPageURL(_entry.getURI().getKeyHash(), tags[i], -1, -1, -1, (_user != null ? _user.getShowExpanded() : false), (_user != null ? _user.getShowImages() : false)));
_preBodyBuffer.append("\">");
_preBodyBuffer.append(sanitizeString(tags[i]));
_preBodyBuffer.append("</a>");
if (i + 1 < tags.length)
_preBodyBuffer.append(", ");
*/
}
_preBodyBuffer.append("</select>");
_preBodyBuffer.append("<input type=\"submit\" value=\"View\" />\n");
//_preBodyBuffer.append("</i>");
}
_preBodyBuffer.append(" ");
/*
String inReplyTo = (String)_headers.get(HEADER_IN_REPLY_TO);
if ( (inReplyTo != null) && (inReplyTo.trim().length() > 0) )
_preBodyBuffer.append(" <a href=\"").append(getPageURL(sanitizeTagParam(inReplyTo))).append("\">In reply to</a>\n");
*/
if (_entry != null)
_preBodyBuffer.append(getEntryDate(_entry.getURI().getEntryId()));
else
_preBodyBuffer.append(getEntryDate(new Date().getTime()));
if ( (_user != null) && (_user.getAuthenticated()) )
_preBodyBuffer.append(" <a href=\"").append(getPostURL(_user.getBlog(), true)).append("\">Reply</a>\n");
_preBodyBuffer.append("\n</td>");
if ( (tags != null) && (tags.length > 0) )
_preBodyBuffer.append("</form>");
_preBodyBuffer.append("</tr>\n");
}
private final SimpleDateFormat _dateFormat = new SimpleDateFormat("yyyy/MM/dd", Locale.UK);
private final String getEntryDate(long when) {
synchronized (_dateFormat) {
try {
String str = _dateFormat.format(new Date(when));
long dayBegin = _dateFormat.parse(str).getTime();
return str + "." + (when - dayBegin);
} catch (ParseException pe) {
pe.printStackTrace();
// wtf
return "unknown";
}
}
}
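// For illustration (values hypothetical): an entry id of 1125598230000
// (2005/09/01 18:10:30 UTC) renders as "2005/09/01.65430000" when the JVM's
// default timezone is UTC - the day, a dot, then the millisecond offset into that day.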
public static final String sanitizeString(String str) { return sanitizeString(str, true); }
public static final String sanitizeString(String str, boolean allowNL) {
if (str == null) return null;
boolean unsafe = false;
unsafe = unsafe || str.indexOf('<') >= 0;
unsafe = unsafe || str.indexOf('>') >= 0;
if (!allowNL) {
unsafe = unsafe || str.indexOf('\n') >= 0;
unsafe = unsafe || str.indexOf('\r') >= 0;
unsafe = unsafe || str.indexOf('\f') >= 0;
}
if (!unsafe) return str;
str = str.replace('<', '_'); // this should be &lt;
str = str.replace('>', '-'); // this should be &gt;
if (!allowNL) {
str = str.replace('\n', ' ');
str = str.replace('\r', ' ');
str = str.replace('\f', ' ');
}
return str;
}
public static final String sanitizeURL(String str) { return Base64.encode(DataHelper.getUTF8(str)); }
public static final String sanitizeTagParam(String str) {
str = str.replace('&', '_'); // this should be &amp;
if (str.indexOf('\"') < 0)
return sanitizeString(str);
str = str.replace('\"', '\'');
return sanitizeString(str);
}
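// Illustrative sketch only - not part of the original file.  It shows how the
// '<' / '>' / '&' substitutions noted in the comments above could be replaced
// with real HTML entities; the character-by-character StringBuffer loop is used
// because the 1.4-era String API has no CharSequence replace().
private static String escapeHTML(String str) {
if (str == null) return null;
StringBuffer buf = new StringBuffer(str.length() + 16);
for (int i = 0; i < str.length(); i++) {
char c = str.charAt(i);
if (c == '&') buf.append("&amp;");
else if (c == '<') buf.append("&lt;");
else if (c == '>') buf.append("&gt;");
else if (c == '\"') buf.append("&quot;");
else buf.append(c);
}
return buf.toString();
}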
private String getEntryURL() { return getEntryURL(_user != null ? _user.getShowImages() : false); }
private String getEntryURL(boolean showImages) {
if (_entry == null) return "unknown";
return "index.jsp?" + ArchiveViewerBean.PARAM_BLOG + "=" +
Base64.encode(_entry.getURI().getKeyHash().getData()) +
"&" + ArchiveViewerBean.PARAM_ENTRY + "=" + _entry.getURI().getEntryId() +
"&" + ArchiveViewerBean.PARAM_SHOW_IMAGES + (showImages ? "=true" : "=false") +
"&" + ArchiveViewerBean.PARAM_EXPAND_ENTRIES + "=true";
}
protected String getAttachmentURLBase() { return "viewattachment.jsp"; }
protected String getAttachmentURL(int id) {
if (_entry == null) return "unknown";
return getAttachmentURLBase() + "?" +
ArchiveViewerBean.PARAM_BLOG + "=" +
Base64.encode(_entry.getURI().getKeyHash().getData()) +
"&" + ArchiveViewerBean.PARAM_ENTRY + "=" + _entry.getURI().getEntryId() +
"&" + ArchiveViewerBean.PARAM_ATTACHMENT + "=" + id;
}
public String getMetadataURL() {
if (_entry == null) return "unknown";
return getMetadataURL(_entry.getURI().getKeyHash());
}
public static String getMetadataURL(Hash blog) {
return "viewmetadata.jsp?" + ArchiveViewerBean.PARAM_BLOG + "=" +
Base64.encode(blog.getData());
}
public static String getPostURL(Hash blog) {
return "post.jsp?" + ArchiveViewerBean.PARAM_BLOG + "=" + Base64.encode(blog.getData());
}
public String getPostURL(Hash blog, boolean asReply) {
if (asReply && _entry != null) {
return "post.jsp?" + ArchiveViewerBean.PARAM_BLOG + "=" + Base64.encode(blog.getData())
+ "&" + ArchiveViewerBean.PARAM_IN_REPLY_TO + '='
+ Base64.encode("entry://" + _entry.getURI().getKeyHash().toBase64() + "/" + _entry.getURI().getEntryId());
} else {
return getPostURL(blog);
}
}
public String getPageURL(String selector) { return getPageURL(_user, selector); }
public static String getPageURL(User user, String selector) { return getPageURL(user, selector, -1, -1); }
public static String getPageURL(User user, String selector, int numPerPage, int pageNum) {
StringBuffer buf = new StringBuffer(128);
buf.append("index.jsp?");
buf.append("selector=").append(sanitizeTagParam(selector)).append("&");
if ( (pageNum >= 0) && (numPerPage > 0) ) {
buf.append(ArchiveViewerBean.PARAM_PAGE_NUMBER).append('=').append(pageNum).append('&');
buf.append(ArchiveViewerBean.PARAM_NUM_PER_PAGE).append('=').append(numPerPage).append('&');
}
buf.append(ArchiveViewerBean.PARAM_EXPAND_ENTRIES).append('=').append(user.getShowExpanded()).append('&');
buf.append(ArchiveViewerBean.PARAM_SHOW_IMAGES).append('=').append(user.getShowImages()).append('&');
return buf.toString();
}
public static String getPageURL(Hash blog, String tag, long entryId, int numPerPage, int pageNum, boolean expandEntries, boolean showImages) {
return getPageURL(blog, tag, entryId, null, numPerPage, pageNum, expandEntries, showImages);
}
public static String getPageURL(Hash blog, String tag, long entryId, String group, int numPerPage, int pageNum, boolean expandEntries, boolean showImages) {
StringBuffer buf = new StringBuffer(128);
buf.append("index.jsp?");
if (blog != null)
buf.append(ArchiveViewerBean.PARAM_BLOG).append('=').append(Base64.encode(blog.getData())).append('&');
if (tag != null)
buf.append(ArchiveViewerBean.PARAM_TAG).append('=').append(Base64.encode(DataHelper.getUTF8(tag))).append('&');
if (entryId >= 0)
buf.append(ArchiveViewerBean.PARAM_ENTRY).append('=').append(entryId).append('&');
if (group != null)
buf.append(ArchiveViewerBean.PARAM_GROUP).append('=').append(Base64.encode(DataHelper.getUTF8(group))).append('&');
if ( (pageNum >= 0) && (numPerPage > 0) ) {
buf.append(ArchiveViewerBean.PARAM_PAGE_NUMBER).append('=').append(pageNum).append('&');
buf.append(ArchiveViewerBean.PARAM_NUM_PER_PAGE).append('=').append(numPerPage).append('&');
}
buf.append(ArchiveViewerBean.PARAM_EXPAND_ENTRIES).append('=').append(expandEntries).append('&');
buf.append(ArchiveViewerBean.PARAM_SHOW_IMAGES).append('=').append(showImages).append('&');
return buf.toString();
}
public static String getArchiveURL(Hash blog, SafeURL archiveLocation) {
return "remote.jsp?"
//+ "action=Continue..." // should this be the case?
+ "&schema=" + sanitizeTagParam(archiveLocation.getSchema())
+ "&location=" + sanitizeTagParam(archiveLocation.getLocation());
}
}

View File

@@ -0,0 +1,442 @@
package net.i2p.syndie.sml;
import java.util.*;
import net.i2p.syndie.data.*;
/**
* Parse out the SML from the text, firing off info to the receiver whenever certain
* elements are available. This is a very simple parser, with no support for nested
* tags. A simple stack would be good to add, but DTSTTCPW (do the simplest thing that could possibly work).
*
*
*/
public class SMLParser {
private static final char TAG_BEGIN = '[';
private static final char TAG_END = ']';
private static final char LT = '<';
private static final char GT = '>';
private static final char EQ = '=';
private static final char DQUOTE = '"';
private static final char QUOTE = '\'';
private static final String WHITESPACE = " \t\n\r";
private static final char NL = '\n';
private static final char CR = '\r';
private static final char LF = '\f'; // form feed, treated as a line break like NL/CR
public void parse(String rawSML, EventReceiver receiver) {
receiver.receiveBegin();
int off = 0;
off = parseHeaders(rawSML, off, receiver);
receiver.receiveHeaderEnd();
parseBody(rawSML, off, receiver);
receiver.receiveEnd();
}
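// Illustrative walkthrough (not from the original source): for the raw SML
// "Subject: hi\n\n[b]bold[/b] plain", parse() fires, in order: receiveBegin(),
// receiveHeader("Subject", "hi"), receiveHeaderEnd(), receiveBold("bold"),
// receivePlain(" plain"), receiveEnd().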
private int parseHeaders(String rawSML, int off, EventReceiver receiver) {
if (rawSML == null) return off;
int len = rawSML.length();
if (len == off) return off;
int keyBegin = off;
int valBegin = -1;
while (off < len) {
char c = rawSML.charAt(off);
if ( (c == ':') && (valBegin < 0) ) {
// moving on to the value
valBegin = off + 1;
} else if (c == '\n') {
if (valBegin < 0) {
// end of the headers
off++;
break;
} else {
String key = rawSML.substring(keyBegin, valBegin-1);
String val = rawSML.substring(valBegin, off);
receiver.receiveHeader(key.trim(), val.trim());
valBegin = -1;
keyBegin = off + 1;
}
}
off++;
}
if ( (off >= len) && (valBegin > 0) ) {
String key = rawSML.substring(keyBegin, valBegin-1);
String val = rawSML.substring(valBegin, len);
receiver.receiveHeader(key.trim(), val.trim());
}
return off;
}
private void parseBody(String rawSMLBody, int off, EventReceiver receiver) {
if (rawSMLBody == null) return;
int begin = off;
int len = rawSMLBody.length();
if (len <= off) return;
int openTagBegin = -1;
int openTagEnd = -1;
int closeTagBegin = -1;
int closeTagEnd = -1;
while (off < len) {
char c = rawSMLBody.charAt(off);
if ( (c == NL) || (c == CR) || (c == LF) ) {
if (openTagBegin < 0) {
if (begin < off)
receiver.receivePlain(rawSMLBody.substring(begin, off));
receiver.receiveNewline();
off++;
begin = off;
continue;
} else {
// ignore NL inside a tag or between tag blocks
}
} else if (c == TAG_BEGIN) {
if ( (off + 1 < len) && (TAG_BEGIN == rawSMLBody.charAt(off+1))) {
if (begin < off)
receiver.receivePlain(rawSMLBody.substring(begin, off));
receiver.receiveLeftBracket();
off += 2;
begin = off;
continue;
} else if (openTagBegin < 0) {
// push everything seen and not accounted for into a plain area
if (closeTagEnd < 0) {
if (begin < off)
receiver.receivePlain(rawSMLBody.substring(begin, off));
} else {
if (closeTagEnd + 1 < off)
receiver.receivePlain(rawSMLBody.substring(closeTagEnd+1, off));
}
openTagBegin = off;
closeTagBegin = -1;
begin = off + 1;
} else {
// ok, we are at the end of the tag, process it
closeTagBegin = off;
while ( (c != TAG_END) && (off < len) ) {
off++;
c = rawSMLBody.charAt(off);
}
parseTag(rawSMLBody, openTagBegin, openTagEnd, closeTagBegin, off, receiver);
begin = off + 1;
openTagBegin = -1;
openTagEnd = -1;
closeTagBegin = -1;
closeTagEnd = -1;
}
} else if (c == TAG_END) {
if ( (openTagBegin > 0) && (closeTagBegin < 0) ) {
openTagEnd = off;
} else if ( (off + 1 < len) && (TAG_END == rawSMLBody.charAt(off+1))) {
if (begin < off)
receiver.receivePlain(rawSMLBody.substring(begin, off));
receiver.receiveRightBracket();
off += 2;
begin = off;
continue;
}
} else if (c == LT) {
if (begin < off)
receiver.receivePlain(rawSMLBody.substring(begin, off));
receiver.receiveLT();
off++;
begin = off;
continue;
} else if (c == GT) {
if (begin < off)
receiver.receivePlain(rawSMLBody.substring(begin, off));
receiver.receiveGT();
off++;
begin = off;
continue;
}
off++;
}
if ( (off >= len) && (openTagBegin < 0) ) {
if (closeTagEnd < 0) {
if (begin < off)
receiver.receivePlain(rawSMLBody.substring(begin, off));
} else {
if (closeTagEnd + 1 < off)
receiver.receivePlain(rawSMLBody.substring(closeTagEnd+1, off));
}
}
}
private void parseTag(String source, int openTagBegin, int openTagEnd, int closeTagBegin, int closeTagEnd, EventReceiver receiver) {
String tagName = getTagName(source, openTagBegin+1);
Map attributes = getAttributes(source, openTagBegin+1+tagName.length(), openTagEnd);
String body = null;
if (openTagEnd + 1 >= closeTagBegin)
body = "";
else
body = source.substring(openTagEnd+1, closeTagBegin);
//System.out.println("Receiving tag [" + tagName + "] w/ open [" + source.substring(openTagBegin+1, openTagEnd)
// + "], close [" + source.substring(closeTagBegin+1, closeTagEnd) + "] body ["
// + body + "] attributes: " + attributes);
parseTag(tagName, attributes, body, receiver);
}
private static final String T_BOLD = "b";
private static final String T_ITALIC = "i";
private static final String T_UNDERLINE = "u";
private static final String T_CUT = "cut";
private static final String T_IMAGE = "img";
private static final String T_QUOTE = "quote";
private static final String T_CODE = "code";
private static final String T_BLOG = "blog";
private static final String T_LINK = "link";
private static final String T_ADDRESS = "address";
private static final String T_H1 = "h1";
private static final String T_H2 = "h2";
private static final String T_H3 = "h3";
private static final String T_H4 = "h4";
private static final String T_H5 = "h5";
private static final String T_HR = "hr";
private static final String T_PRE = "pre";
private static final String T_ATTACHMENT = "attachment";
private static final String T_ARCHIVE = "archive";
private static final String P_ATTACHMENT = "attachment";
private static final String P_WHO_QUOTED = "author";
private static final String P_QUOTE_LOCATION = "location";
private static final String P_CODE_LOCATION = "location";
private static final String P_BLOG_NAME = "name";
private static final String P_BLOG_HASH = "bloghash";
private static final String P_BLOG_TAG = "blogtag";
private static final String P_BLOG_ENTRY = "blogentry";
private static final String P_LINK_LOCATION = "location";
private static final String P_LINK_SCHEMA = "schema";
private static final String P_ADDRESS_NAME = "name";
private static final String P_ADDRESS_LOCATION = "location";
private static final String P_ADDRESS_SCHEMA = "schema";
private static final String P_ATTACHMENT_ID = "id";
private static final String P_ARCHIVE_NAME = "name";
private static final String P_ARCHIVE_DESCRIPTION = "description";
private static final String P_ARCHIVE_LOCATION_SCHEMA = "schema";
private static final String P_ARCHIVE_LOCATION = "location";
private static final String P_ARCHIVE_POSTING_KEY = "postingkey";
private void parseTag(String tagName, Map attr, String body, EventReceiver receiver) {
tagName = tagName.toLowerCase();
if (T_BOLD.equals(tagName)) {
receiver.receiveBold(body);
} else if (T_ITALIC.equals(tagName)) {
receiver.receiveItalic(body);
} else if (T_UNDERLINE.equals(tagName)) {
receiver.receiveUnderline(body);
} else if (T_CUT.equals(tagName)) {
receiver.receiveCut(body);
} else if (T_IMAGE.equals(tagName)) {
receiver.receiveImage(body, getInt(P_ATTACHMENT, attr));
} else if (T_QUOTE.equals(tagName)) {
receiver.receiveQuote(body, getString(P_WHO_QUOTED, attr), getSchema(P_QUOTE_LOCATION, attr), getLocation(P_QUOTE_LOCATION, attr));
} else if (T_CODE.equals(tagName)) {
receiver.receiveCode(body, getSchema(P_CODE_LOCATION, attr), getLocation(P_CODE_LOCATION, attr));
} else if (T_BLOG.equals(tagName)) {
List locations = new ArrayList();
int i = 0;
while (true) {
String s = getString("archive" + i, attr);
if (s != null)
locations.add(new SafeURL(s));
else
break;
i++;
}
receiver.receiveBlog(getString(P_BLOG_NAME, attr), getString(P_BLOG_HASH, attr), getString(P_BLOG_TAG, attr),
getLong(P_BLOG_ENTRY, attr), locations, body);
} else if (T_ARCHIVE.equals(tagName)) {
receiver.receiveArchive(getString(P_ARCHIVE_NAME, attr), getString(P_ARCHIVE_DESCRIPTION, attr),
getString(P_ARCHIVE_LOCATION_SCHEMA, attr), getString(P_ARCHIVE_LOCATION, attr),
getString(P_ARCHIVE_POSTING_KEY, attr), body);
} else if (T_LINK.equals(tagName)) {
receiver.receiveLink(getString(P_LINK_SCHEMA, attr), getString(P_LINK_LOCATION, attr), body);
} else if (T_ADDRESS.equals(tagName)) {
receiver.receiveAddress(getString(P_ADDRESS_NAME, attr), getString(P_ADDRESS_SCHEMA, attr), getString(P_ADDRESS_LOCATION, attr), body);
} else if (T_H1.equals(tagName)) {
receiver.receiveH1(body);
} else if (T_H2.equals(tagName)) {
receiver.receiveH2(body);
} else if (T_H3.equals(tagName)) {
receiver.receiveH3(body);
} else if (T_H4.equals(tagName)) {
receiver.receiveH4(body);
} else if (T_H5.equals(tagName)) {
receiver.receiveH5(body);
} else if (T_HR.equals(tagName)) {
receiver.receiveHR();
} else if (T_PRE.equals(tagName)) {
receiver.receivePre(body);
} else if (T_ATTACHMENT.equals(tagName)) {
receiver.receiveAttachment((int)getLong(P_ATTACHMENT_ID, attr), body);
} else {
System.out.println("need to learn how to parse the tag [" + tagName + "]");
}
}
private String getString(String param, Map attributes) { return (String)attributes.get(param); }
private String getSchema(String param, Map attributes) {
String url = getString(param, attributes);
if (url != null) {
SafeURL u = new SafeURL(url);
return u.getSchema();
} else {
return null;
}
}
private String getLocation(String param, Map attributes) {
String url = getString(param, attributes);
if (url != null) {
SafeURL u = new SafeURL(url);
return u.getLocation();
} else {
return null;
}
}
private int getInt(String attributeName, Map attributes) {
String val = (String)attributes.get(attributeName.toLowerCase());
if (val != null) {
try {
return Integer.parseInt(val.trim());
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
return -1;
}
} else {
return -1;
}
}
private long getLong(String attributeName, Map attributes) {
String val = (String)attributes.get(attributeName.toLowerCase());
if (val != null) {
try {
return Long.parseLong(val.trim());
} catch (NumberFormatException nfe) {
nfe.printStackTrace();
return -1;
}
} else {
return -1;
}
}
private String getTagName(String source, int nameStart) {
int off = nameStart;
while (off < source.length()) {
char c = source.charAt(off);
if ( (c == TAG_END) || (WHITESPACE.indexOf(c) >= 0) )
return source.substring(nameStart, off);
off++;
}
return source.substring(nameStart); // unterminated tag - return whatever name we have
}
private Map getAttributes(String source, int attributesStart, int openTagEnd) {
Map rv = new HashMap();
int off = attributesStart;
int nameStart = -1;
int nameEnd = -1;
int valStart = -1;
int valEnd = -1;
while (true) {
char c = source.charAt(off);
if ( (c == TAG_END) || (off >= openTagEnd) )
break;
if (WHITESPACE.indexOf(c) < 0) {
if (nameStart < 0) {
nameStart = off;
} else if (c == EQ) {
if (nameEnd < 0)
nameEnd = off;
} else if ( (c == QUOTE) || (c == DQUOTE) ) {
if (valStart < 0) {
valStart = off;
} else {
valEnd = off;
String name = source.substring(nameStart, nameEnd);
String val = source.substring(valStart+1, valEnd);
rv.put(name.trim(), val.trim());
nameStart = -1;
nameEnd = -1;
valStart = -1;
valEnd = -1;
}
}
}
off++;
}
return rv;
}
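// For illustration (not in the original): given the open tag
// [quote author='jrandom' location='eep://foo'], this returns a map
// containing {author=jrandom, location=eep://foo}.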
public interface EventReceiver {
public void receiveHeader(String header, String value);
public void receiveLink(String schema, String location, String text);
/** @param blogArchiveLocations list of SafeURL */
public void receiveBlog(String name, String blogKeyHash, String blogPath, long blogEntryId,
List blogArchiveLocations, String anchorText);
public void receiveArchive(String name, String description, String locationSchema, String location,
String postingKey, String anchorText);
public void receiveImage(String alternateText, int attachmentId);
public void receiveAddress(String name, String schema, String location, String anchorText);
public void receiveAttachment(int id, String anchorText);
public void receiveBold(String text);
public void receiveItalic(String text);
public void receiveUnderline(String text);
public void receiveH1(String text);
public void receiveH2(String text);
public void receiveH3(String text);
public void receiveH4(String text);
public void receiveH5(String text);
public void receivePre(String text);
public void receiveHR();
public void receiveQuote(String text, String whoQuoted, String quoteLocationSchema, String quoteLocation);
public void receiveCode(String text, String codeLocationSchema, String codeLocation);
public void receiveCut(String summaryText);
public void receivePlain(String text);
public void receiveNewline();
public void receiveLT();
public void receiveGT();
public void receiveLeftBracket();
public void receiveRightBracket();
public void receiveBegin();
public void receiveEnd();
public void receiveHeaderEnd();
}
public static void main(String args[]) {
test(null);
test("");
test("A: B");
test("A: B\n");
test("A: B\nC: D");
test("A: B\nC: D\n");
test("A: B\nC: D\n\n");
test("A: B\nC: D\n\nblah");
test("A: B\nC: D\n\nblah[[");
test("A: B\nC: D\n\nblah]]");
test("A: B\nC: D\n\nblah]]blah");
test("A: B\nC: D\n\nfoo[a]b[/a]bar");
test("A: B\nC: D\n\nfoo[a]b[/a]bar[b][/b]");
test("A: B\nC: D\n\nfoo[a]b[/a]bar[b][/b]baz");
test("A: B\nC: D\n\n<a href=\"http://odci.gov\">hi</a>");
test("A: B\n\n[a b='c']d[/a]");
test("A: B\n\n[a b='c' d='e' f='g']h[/a]");
test("A: B\n\n[a b='c' d='e' f='g']h[/a][a b='c' d='e' f='g']h[/a][a b='c' d='e' f='g']h[/a]");
test("A: B\n\n[a b='c' ]d[/a]");
test("A: B\n\n[a b=\"c\" ]d[/a]");
test("A: B\n\n[b]This[/b] is [i]special[/i][cut]why?[/cut][u]because I say so[/u].\neven if you dont care");
}
private static void test(String rawSML) {
SMLParser parser = new SMLParser();
parser.parse(rawSML, new EventReceiverImpl());
}
}

View File

@@ -0,0 +1,183 @@
package net.i2p.syndie.web;
import java.io.*;
import java.util.*;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.ServletException;
import net.i2p.data.*;
import net.i2p.syndie.*;
import net.i2p.syndie.data.*;
/**
*
*/
public class ArchiveServlet extends HttpServlet {
public void doGet(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
String path = req.getPathInfo();
if ( (path == null) || (path.trim().length() <= 1) ) {
renderRootIndex(resp);
return;
} else if (path.endsWith(Archive.INDEX_FILE)) {
renderSummary(resp);
} else if (path.endsWith("export.zip")) {
ExportServlet.export(req, resp);
} else {
String blog = getBlog(path);
if (path.endsWith(Archive.METADATA_FILE)) {
renderMetadata(blog, resp);
} else if (path.endsWith(".snd")) {
renderEntry(blog, getEntry(path), resp);
} else {
renderBlogIndex(blog, resp);
}
}
}
private String getBlog(String path) {
//System.err.println("Blog: [" + path + "]");
int start = 0;
int end = -1;
int len = path.length();
for (int i = 0; i < len; i++) {
if (path.charAt(i) != '/') {
start = i;
break;
}
}
for (int j = start + 1; j < len; j++) {
if (path.charAt(j) == '/') {
end = j;
break;
}
}
if (end < 0) end = len;
String rv = path.substring(start, end);
//System.err.println("Blog: [" + path + "] rv: [" + rv + "]");
return rv;
}
private long getEntry(String path) {
int start = path.lastIndexOf('/');
if (start < 0) return -1;
if (!(path.endsWith(".snd"))) return -1;
String rv = path.substring(start+1, path.length()-".snd".length());
//System.err.println("Entry: [" + path + "] rv: [" + rv + "]");
try {
return Long.parseLong(rv);
} catch (NumberFormatException nfe) {
return -1;
}
}
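// Example (values hypothetical): for the request path
// "/b64BlogHash.../1125598230000.snd", getBlog() returns "b64BlogHash..."
// and getEntry() returns 1125598230000.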
private void renderRootIndex(HttpServletResponse resp) throws ServletException, IOException {
resp.setContentType("text/html;charset=utf-8");
//resp.setCharacterEncoding("UTF-8");
OutputStream out = resp.getOutputStream();
out.write(DataHelper.getUTF8("<a href=\"archive.txt\">archive.txt</a><br />\n"));
ArchiveIndex index = BlogManager.instance().getArchive().getIndex();
Set blogs = index.getUniqueBlogs();
for (Iterator iter = blogs.iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
String s = blog.toBase64();
out.write(DataHelper.getUTF8("<a href=\"" + s + "/\">" + s + "</a><br />\n"));
}
out.close();
}
private void renderSummary(HttpServletResponse resp) throws ServletException, IOException {
resp.setContentType("text/plain;charset=utf-8");
//resp.setCharacterEncoding("UTF-8");
OutputStream out = resp.getOutputStream();
ArchiveIndex index = BlogManager.instance().getArchive().getIndex();
out.write(DataHelper.getUTF8(index.toString()));
out.close();
}
private void renderMetadata(String blog, HttpServletResponse resp) throws ServletException, IOException {
byte b[] = Base64.decode(blog);
if ( (b == null) || (b.length != Hash.HASH_LENGTH) ) {
resp.sendError(404, "Invalid blog requested");
return;
}
Hash h = new Hash(b);
BlogInfo info = BlogManager.instance().getArchive().getBlogInfo(h);
if (info == null) {
resp.sendError(404, "Blog does not exist");
return;
}
resp.setContentType("application/x-syndie-meta");
OutputStream out = resp.getOutputStream();
info.write(out);
out.close();
}
private void renderBlogIndex(String blog, HttpServletResponse resp) throws ServletException, IOException {
byte b[] = Base64.decode(blog);
if ( (b == null) || (b.length != Hash.HASH_LENGTH) ) {
resp.sendError(404, "Invalid blog requested");
return;
}
Hash h = new Hash(b);
BlogInfo info = BlogManager.instance().getArchive().getBlogInfo(h);
if (info == null) {
resp.sendError(404, "Blog does not exist");
return;
}
resp.setContentType("text/html;charset=utf-8");
//resp.setCharacterEncoding("UTF-8");
OutputStream out = resp.getOutputStream();
out.write(DataHelper.getUTF8("<a href=\"..\">..</a><br />\n"));
out.write(DataHelper.getUTF8("<a href=\"" + Archive.METADATA_FILE + "\">" + Archive.METADATA_FILE + "</a><br />\n"));
List entries = new ArrayList(64);
BlogManager.instance().getArchive().getIndex().selectMatchesOrderByEntryId(entries, h, null);
for (int i = 0; i < entries.size(); i++) {
BlogURI entry = (BlogURI)entries.get(i);
out.write(DataHelper.getUTF8("<a href=\"" + entry.getEntryId() + ".snd\">" + entry.getEntryId() + ".snd</a><br />\n"));
}
out.close();
}
private void renderEntry(String blog, long entryId, HttpServletResponse resp) throws ServletException, IOException {
byte b[] = Base64.decode(blog);
if ( (b == null) || (b.length != Hash.HASH_LENGTH) ) {
resp.sendError(404, "Invalid blog requested");
return;
}
Hash h = new Hash(b);
BlogInfo info = BlogManager.instance().getArchive().getBlogInfo(h);
if (info == null) {
resp.sendError(404, "Blog does not exist");
return;
}
File root = BlogManager.instance().getArchive().getArchiveDir();
File blogDir = new File(root, blog);
if (!blogDir.exists()) {
resp.sendError(404, "Blog does not exist");
return;
}
File entry = new File(blogDir, entryId + ".snd");
if (!entry.exists()) {
resp.sendError(404, "Entry does not exist");
return;
}
resp.setContentType("application/x-syndie-post");
dump(entry, resp);
}
private void dump(File source, HttpServletResponse resp) throws ServletException, IOException {
FileInputStream in = new FileInputStream(source);
OutputStream out = resp.getOutputStream();
byte buf[] = new byte[1024];
int read = 0;
while ( (read = in.read(buf)) != -1)
out.write(buf, 0, read);
out.close();
in.close();
}
}

View File

@@ -0,0 +1,625 @@
package net.i2p.syndie.web;
import java.io.*;
import java.text.*;
import java.util.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.syndie.*;
import net.i2p.syndie.data.*;
import net.i2p.syndie.sml.*;
/**
*
*/
public class ArchiveViewerBean {
public static String getBlogName(String keyHash) {
BlogInfo info = BlogManager.instance().getArchive().getBlogInfo(new Hash(Base64.decode(keyHash)));
if (info == null)
return HTMLRenderer.sanitizeString(keyHash);
else
return HTMLRenderer.sanitizeString(info.getProperty("Name"));
}
public static String getEntryTitle(String keyHash, long entryId) {
String name = getBlogName(keyHash);
return getEntryTitleDate(name, entryId);
}
private static final SimpleDateFormat _dateFormat = new SimpleDateFormat("yyyy/MM/dd", Locale.UK);
public static final String getEntryTitleDate(String blogName, long when) {
synchronized (_dateFormat) {
try {
String str = _dateFormat.format(new Date(when));
long dayBegin = _dateFormat.parse(str).getTime();
return blogName + ":<br /> <i>" + str + "-" + (when - dayBegin) + "</i>";
} catch (ParseException pe) {
pe.printStackTrace();
// wtf
return "unknown";
}
}
}
/** base64 encoded hash of the blog's public key, or null for no filtering by blog */
public static final String PARAM_BLOG = "blog";
/** base64 encoded tag to filter by, or blank for no filtering by tags */
public static final String PARAM_TAG = "tag";
/** entry id within the blog if we only want to see that one */
public static final String PARAM_ENTRY = "entry";
/** base64 encoded group within the user's filters */
public static final String PARAM_GROUP = "group";
/** how many entries per page to show at once */
public static final String PARAM_NUM_PER_PAGE = "pageSize";
/** which page of entries to render */
public static final String PARAM_PAGE_NUMBER = "pageNum";
/** should we expand each entry to show the full contents */
public static final String PARAM_EXPAND_ENTRIES = "expand";
/** should entries be rendered with the images shown inline */
public static final String PARAM_SHOW_IMAGES = "images";
/** should we regenerate an index to the archive before rendering */
public static final String PARAM_REGENERATE_INDEX = "regenerateIndex";
/** which attachment should we serve up raw */
public static final String PARAM_ATTACHMENT = "attachment";
/** we are replying to a particular blog/tag/entry/whatever (value == base64 encoded selector) */
public static final String PARAM_IN_REPLY_TO = "inReplyTo";
/**
* Drop down multichooser:
* blog://base64(key)
* tag://base64(tag)
* blogtag://base64(key)/base64(tag)
* entry://base64(key)/entryId
* group://base64(groupName)
* ALL
*/
public static final String PARAM_SELECTOR = "selector";
public static final String SEL_ALL = "ALL";
public static final String SEL_BLOG = "blog://";
public static final String SEL_TAG = "tag://";
public static final String SEL_BLOGTAG = "blogtag://";
public static final String SEL_ENTRY = "entry://";
public static final String SEL_GROUP = "group://";
/** submit field for the selector form */
public static final String PARAM_SELECTOR_ACTION = "action";
public static final String SEL_ACTION_SET_AS_DEFAULT = "Set as default";
public static void renderBlogSelector(User user, Map parameters, Writer out) throws IOException {
String sel = getString(parameters, PARAM_SELECTOR);
String action = getString(parameters, PARAM_SELECTOR_ACTION);
if ( (sel != null) && (action != null) && (SEL_ACTION_SET_AS_DEFAULT.equals(action)) ) {
user.setDefaultSelector(HTMLRenderer.sanitizeString(sel, false));
BlogManager.instance().saveUser(user);
}
out.write("<select name=\"");
out.write(PARAM_SELECTOR);
out.write("\">");
out.write("<option value=\"");
out.write(getDefaultSelector(user, parameters));
out.write("\">Default blog filter</option>\n");
out.write("\">");
out.write("<option value=\"");
out.write(SEL_ALL);
out.write("\">All posts from all blogs</option>\n");
Map groups = null;
if (user != null)
groups = user.getBlogGroups();
if (groups != null) {
for (Iterator iter = groups.keySet().iterator(); iter.hasNext(); ) {
String name = (String)iter.next();
out.write("<option value=\"group://" + Base64.encode(DataHelper.getUTF8(name)) + "\">" +
"Group: " + HTMLRenderer.sanitizeString(name) + "</option>\n");
}
}
Archive archive = BlogManager.instance().getArchive();
ArchiveIndex index = archive.getIndex();
for (int i = 0; i < index.getNewestBlogCount(); i++) {
Hash cur = index.getNewestBlog(i);
String blog = Base64.encode(cur.getData());
out.write("<option value=\"blog://" + blog + "\">");
out.write("New blog: ");
BlogInfo info = archive.getBlogInfo(cur);
String name = (info != null ? info.getProperty(BlogInfo.NAME) : null);
if (name != null)
name = HTMLRenderer.sanitizeString(name);
else
name = Base64.encode(cur.getData());
out.write(name);
out.write("</option>\n");
}
List allTags = new ArrayList();
// perhaps sort this by name (even though it isn't unique...)
Set blogs = index.getUniqueBlogs();
for (Iterator iter = blogs.iterator(); iter.hasNext(); ) {
Hash cur = (Hash)iter.next();
String blog = Base64.encode(cur.getData());
out.write("<option value=\"blog://");
out.write(blog);
out.write("\">");
BlogInfo info = archive.getBlogInfo(cur);
String name = (info != null ? info.getProperty(BlogInfo.NAME) : null);
if (name != null)
name = HTMLRenderer.sanitizeString(name);
else
name = Base64.encode(cur.getData());
out.write(name);
out.write("- all posts</option>\n");
List tags = index.getBlogTags(cur);
for (int j = 0; j < tags.size(); j++) {
String tag = (String)tags.get(j);
if (false) {
StringBuffer b = new StringBuffer(tag.length()*2);
for (int k = 0; k < tag.length(); k++) {
b.append((int)tag.charAt(k));
b.append(' ');
}
System.out.println("tag in select: " + tag + ": " + b.toString());
}
if (!allTags.contains(tag))
allTags.add(tag);
out.write("<option value=\"blogtag://");
out.write(blog);
out.write("/");
byte utf8tag[] = DataHelper.getUTF8(tag);
String encoded = Base64.encode(utf8tag);
if (false) {
byte utf8dec[] = Base64.decode(encoded);
String travel = DataHelper.getUTF8(utf8dec);
StringBuffer b = new StringBuffer();
for (int k = 0; k < travel.length(); k++) {
b.append((int)travel.charAt(k));
b.append(' ');
}
b.append(" encoded into: ");
for (int k = 0; k < encoded.length(); k++) {
b.append((int)encoded.charAt(k));
b.append(' ');
}
System.out.println("UTF8(unbase64(base64(UTF8(tag)))) == tag: " + b.toString());
}
out.write(encoded);
out.write("\">");
out.write(name);
out.write("- posts with the tag &quot;");
out.write(tag);
out.write("&quot;</option>\n");
}
}
for (int i = 0; i < allTags.size(); i++) {
String tag = (String)allTags.get(i);
out.write("<option value=\"tag://");
out.write(Base64.encode(DataHelper.getUTF8(tag)));
out.write("\">Posts in any blog with the tag &quot;");
out.write(tag);
out.write("&quot;</option>\n");
}
out.write("</select>");
int numPerPage = getInt(parameters, PARAM_NUM_PER_PAGE, 5);
int pageNum = getInt(parameters, PARAM_PAGE_NUMBER, 0);
boolean expandEntries = getBool(parameters, PARAM_EXPAND_ENTRIES, (user != null ? user.getShowExpanded() : false));
boolean showImages = getBool(parameters, PARAM_SHOW_IMAGES, (user != null ? user.getShowImages() : false));
out.write("<input type=\"hidden\" name=\"" + PARAM_NUM_PER_PAGE+ "\" value=\"" + numPerPage+ "\" />");
out.write("<input type=\"hidden\" name=\"" + PARAM_PAGE_NUMBER+ "\" value=\"" + pageNum+ "\" />");
out.write("<input type=\"hidden\" name=\"" + PARAM_EXPAND_ENTRIES+ "\" value=\"" + expandEntries+ "\" />");
out.write("<input type=\"hidden\" name=\"" + PARAM_SHOW_IMAGES + "\" value=\"" + showImages + "\" />");
}
private static String getDefaultSelector(User user, Map parameters) {
if ( (user == null) || (user.getDefaultSelector() == null) )
return BlogManager.instance().getArchive().getDefaultSelector();
else
return user.getDefaultSelector();
}
public static void renderBlogs(User user, Map parameters, Writer out, String afterPagination) throws IOException {
String blogStr = getString(parameters, PARAM_BLOG);
Hash blog = null;
if (blogStr != null) blog = new Hash(Base64.decode(blogStr));
String tag = getString(parameters, PARAM_TAG);
if (tag != null) tag = DataHelper.getUTF8(Base64.decode(tag));
long entryId = -1;
if (blogStr != null) {
String entryIdStr = getString(parameters, PARAM_ENTRY);
try {
entryId = Long.parseLong(entryIdStr);
} catch (NumberFormatException nfe) {}
}
String group = getString(parameters, PARAM_GROUP);
if (group != null) group = DataHelper.getUTF8(Base64.decode(group));
String sel = getString(parameters, PARAM_SELECTOR);
if ( (sel == null) && (blog == null) && (group == null) && (tag == null) )
sel = getDefaultSelector(user, parameters);
if (sel != null) {
Selector s = new Selector(sel);
blog = s.blog;
tag = s.tag;
entryId = s.entry;
group = s.group;
}
int numPerPage = getInt(parameters, PARAM_NUM_PER_PAGE, 5);
int pageNum = getInt(parameters, PARAM_PAGE_NUMBER, 0);
boolean expandEntries = getBool(parameters, PARAM_EXPAND_ENTRIES, (user != null ? user.getShowExpanded() : false));
boolean showImages = getBool(parameters, PARAM_SHOW_IMAGES, (user != null ? user.getShowImages() : false));
boolean regenerateIndex = getBool(parameters, PARAM_REGENERATE_INDEX, false);
try {
renderBlogs(user, blog, tag, entryId, group, numPerPage, pageNum, expandEntries, showImages, regenerateIndex, sel, out, afterPagination);
} catch (IOException ioe) {
ioe.printStackTrace();
throw ioe;
} catch (RuntimeException re) {
re.printStackTrace();
throw re;
}
}
public static class Selector {
public Hash blog;
public String tag;
public long entry;
public String group;
public Selector(String selector) {
entry = -1;
blog = null;
tag = null;
if (selector != null) {
if (selector.startsWith(SEL_BLOG)) {
String blogStr = selector.substring(SEL_BLOG.length());
System.out.println("Selector [" + selector + "] blogString: [" + blogStr + "]");
blog = new Hash(Base64.decode(blogStr));
} else if (selector.startsWith(SEL_BLOGTAG)) {
int tagStart = selector.lastIndexOf('/');
String blogStr = selector.substring(SEL_BLOGTAG.length(), tagStart);
blog = new Hash(Base64.decode(blogStr));
tag = selector.substring(tagStart+1);
String origTag = tag;
byte rawDecode[] = null;
if (tag != null) {
rawDecode = Base64.decode(tag);
tag = DataHelper.getUTF8(rawDecode);
}
System.out.println("Selector [" + selector + "] blogString: [" + blogStr + "] tag: [" + tag + "]");
if (false && tag != null) {
StringBuffer b = new StringBuffer(tag.length()*2);
for (int j = 0; j < tag.length(); j++) {
b.append((int)tag.charAt(j));
if (rawDecode.length > j)
b.append('.').append((int)rawDecode[j]);
b.append(' ');
}
b.append("encoded as ");
for (int j = 0; j < origTag.length(); j++) {
b.append((int)origTag.charAt(j)).append(' ');
}
System.out.println("selected tag: " + b.toString());
}
} else if (selector.startsWith(SEL_TAG)) {
tag = selector.substring(SEL_TAG.length());
byte rawDecode[] = null;
if (tag != null) {
rawDecode = Base64.decode(tag);
tag = DataHelper.getUTF8(rawDecode);
}
System.out.println("Selector [" + selector + "] tag: [" + tag + "]");
if (false && tag != null) {
StringBuffer b = new StringBuffer(tag.length()*2);
for (int j = 0; j < tag.length(); j++) {
b.append((int)tag.charAt(j));
if (rawDecode.length > j)
b.append('.').append((int)rawDecode[j]);
b.append(' ');
}
System.out.println("selected tag: " + b.toString());
}
} else if (selector.startsWith(SEL_ENTRY)) {
int entryStart = selector.lastIndexOf('/');
String blogStr = selector.substring(SEL_ENTRY.length(), entryStart);
String entryStr = selector.substring(entryStart+1);
try {
entry = Long.parseLong(entryStr);
blog = new Hash(Base64.decode(blogStr));
System.out.println("Selector [" + selector + "] blogString: [" + blogStr + "] entry: [" + entry + "]");
} catch (NumberFormatException nfe) {}
} else if (selector.startsWith(SEL_GROUP)) {
group = DataHelper.getUTF8(Base64.decode(selector.substring(SEL_GROUP.length())));
System.out.println("Selector [" + selector + "] group: [" + group + "]");
}
}
}
}
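// Illustrative example (hash value hypothetical):
//   new Selector("entry://" + someBase64BlogHash + "/1125598230000")
// leaves blog set to the decoded hash, entry == 1125598230000, and tag/group
// null, matching the URI forms documented above PARAM_SELECTOR.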
private static void renderBlogs(User user, Hash blog, String tag, long entryId, String group, int numPerPage, int pageNum,
boolean expandEntries, boolean showImages, boolean regenerateIndex, String selector, Writer out, String afterPagination) throws IOException {
Archive archive = BlogManager.instance().getArchive();
if (regenerateIndex)
archive.regenerateIndex();
ArchiveIndex index = archive.getIndex();
List entries = pickEntryURIs(user, index, blog, tag, entryId, group);
System.out.println("Searching for " + blog + "/" + tag + "/" + entryId + "/" + pageNum + "/" + numPerPage + "/" + group);
System.out.println("Entry URIs: " + entries);
HTMLRenderer renderer = new HTMLRenderer();
int start = pageNum * numPerPage;
int end = start + numPerPage;
int pages = 1;
if (entries.size() <= 1) {
// just one, so no pagination, etc
start = 0;
end = 1;
} else {
if (end >= entries.size())
end = entries.size();
if ( (pageNum < 0) || (numPerPage <= 0) ) {
start = 0;
end = entries.size(); // pagination disabled, so show everything
} else {
pages = entries.size() / numPerPage;
if (numPerPage * pages < entries.size())
pages++;
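// e.g. 12 entries at 5 per page: 12/5 = 2, and 2*5 < 12, so pages becomes 3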
out.write("<i>");
if (pageNum > 0) {
String prevURL = null;
if ( (selector == null) || (selector.trim().length() <= 0) )
prevURL = HTMLRenderer.getPageURL(blog, tag, entryId, group, numPerPage, pageNum-1, expandEntries, showImages);
else
prevURL = HTMLRenderer.getPageURL(user, selector, numPerPage, pageNum-1);
System.out.println("prevURL: " + prevURL);
out.write(" <a href=\"" + prevURL + "\">&lt;&lt;</a>");
} else {
out.write(" &lt;&lt; ");
}
out.write("Page " + (pageNum+1) + " of " + pages);
if (pageNum + 1 < pages) {
String nextURL = null;
if ( (selector == null) || (selector.trim().length() <= 0) )
nextURL = HTMLRenderer.getPageURL(blog, tag, entryId, group, numPerPage, pageNum+1, expandEntries, showImages);
else
nextURL = HTMLRenderer.getPageURL(user, selector, numPerPage, pageNum+1);
System.out.println("nextURL: " + nextURL);
out.write(" <a href=\"" + nextURL + "\">&gt;&gt;</a>");
} else {
out.write(" &gt;&gt;");
}
out.write("</i>");
}
}
/*
out.write(" <i>");
if (showImages)
out.write("<a href=\"" + HTMLRenderer.getPageURL(blog, tag, entryId, group, numPerPage, pageNum, expandEntries, false) +
"\">Hide images</a>");
else
out.write("<a href=\"" + HTMLRenderer.getPageURL(blog, tag, entryId, group, numPerPage, pageNum, expandEntries, true) +
"\">Show images</a>");
if (expandEntries)
out.write(" <a href=\"" + HTMLRenderer.getPageURL(blog, tag, entryId, group, numPerPage, pageNum, false, showImages) +
"\">Hide details</a>");
else
out.write(" <a href=\"" + HTMLRenderer.getPageURL(blog, tag, entryId, group, numPerPage, pageNum, true, showImages) +
"\">Expand details</a>");
out.write("</i>");
*/
if (afterPagination != null)
out.write(afterPagination);
if (entries.size() <= 0) end = -1;
System.out.println("Entries.size: " + entries.size() + " start=" + start + " end=" + end);
for (int i = start; i < end; i++) {
BlogURI uri = (BlogURI)entries.get(i);
EntryContainer c = archive.getEntry(uri);
try {
if (c == null)
renderer.renderUnknownEntry(user, archive, uri, out);
else
renderer.render(user, archive, c, out, !expandEntries, showImages);
} catch (RuntimeException e) {
e.printStackTrace();
throw e;
}
}
}
private static List pickEntryURIs(User user, ArchiveIndex index, Hash blog, String tag, long entryId, String group) {
List rv = new ArrayList(16);
if ( (blog != null) && (entryId >= 0) ) {
rv.add(new BlogURI(blog, entryId));
return rv;
}
if ( (group != null) && (user != null) ) {
List selectors = (List)user.getBlogGroups().get(group);
if (selectors != null) {
System.out.println("Selectors for group " + group + ": " + selectors);
for (int i = 0; i < selectors.size(); i++) {
String sel = (String)selectors.get(i);
Selector s = new Selector(sel);
if ( (s.entry >= 0) && (s.blog != null) && (s.group == null) && (s.tag == null) )
rv.add(new BlogURI(s.blog, s.entry));
else
index.selectMatchesOrderByEntryId(rv, s.blog, s.tag);
}
return rv;
}
}
index.selectMatchesOrderByEntryId(rv, blog, tag);
return rv;
}
public static final String getString(Map parameters, String param) {
if ( (parameters == null) || (parameters.get(param) == null) )
return null;
Object vals = parameters.get(param);
if (vals.getClass().isArray()) {
String v[] = (String[])vals;
if (v.length > 0)
return ((String[])vals)[0];
else
return null;
} else if (vals instanceof Collection) {
Collection c = (Collection)vals;
if (c.size() > 0)
return (String)c.iterator().next();
else
return null;
} else {
return null;
}
}
public static final String[] getStrings(Map parameters, String param) {
if ( (parameters == null) || (parameters.get(param) == null) )
return null;
Object vals = parameters.get(param);
if (vals.getClass().isArray()) {
return (String[])vals;
} else if (vals instanceof Collection) {
Collection c = (Collection)vals;
if (c.size() <= 0) return null;
String rv[] = new String[c.size()];
int i = 0;
for (Iterator iter = c.iterator(); iter.hasNext(); i++)
rv[i] = (String)iter.next();
return rv;
} else {
return null;
}
}
private static final int getInt(Map param, String key, int defaultVal) {
String val = getString(param, key);
if (val != null) {
try { return Integer.parseInt(val); } catch (NumberFormatException nfe) {}
}
return defaultVal;
}
private static final boolean getBool(Map param, String key, boolean defaultVal) {
String val = getString(param, key);
if (val != null) {
return ("true".equals(val) || "yes".equals(val));
}
return defaultVal;
}
public static void renderAttachment(Map parameters, OutputStream out) throws IOException {
Attachment a = getAttachment(parameters);
if (a == null) {
renderInvalidAttachment(parameters, out);
} else {
InputStream data = a.getDataStream();
byte buf[] = new byte[1024];
int read = 0;
while ( (read = data.read(buf)) != -1)
out.write(buf, 0, read);
data.close();
}
}
public static final String getAttachmentContentType(Map parameters) {
Attachment a = getAttachment(parameters);
if (a == null)
return "text/html";
String mime = a.getMimeType();
if ( (mime != null) && ((mime.startsWith("image/") || mime.startsWith("text/plain"))) )
return mime;
return "application/octet-stream";
}
public static final int getAttachmentContentLength(Map parameters) {
Attachment a = getAttachment(parameters);
if (a != null)
return a.getDataLength();
else
return -1;
}
private static final Attachment getAttachment(Map parameters) {
String blogStr = getString(parameters, PARAM_BLOG);
Hash blog = null;
if (blogStr != null) blog = new Hash(Base64.decode(blogStr));
long entryId = -1;
if (blogStr != null) {
String entryIdStr = getString(parameters, PARAM_ENTRY);
try {
entryId = Long.parseLong(entryIdStr);
} catch (NumberFormatException nfe) {}
}
int attachment = getInt(parameters, PARAM_ATTACHMENT, -1);
Archive archive = BlogManager.instance().getArchive();
EntryContainer entry = archive.getEntry(new BlogURI(blog, entryId));
if ( (entry != null) && (attachment >= 0) && (attachment < entry.getAttachments().length) ) {
return entry.getAttachments()[attachment];
}
return null;
}
private static void renderInvalidAttachment(Map parameters, OutputStream out) throws IOException {
out.write(DataHelper.getUTF8("<b>No such entry, or no such attachment</b>"));
}
public static void renderMetadata(Map parameters, Writer out) throws IOException {
String blogStr = getString(parameters, PARAM_BLOG);
if (blogStr != null) {
Hash blog = new Hash(Base64.decode(blogStr));
Archive archive = BlogManager.instance().getArchive();
BlogInfo info = archive.getBlogInfo(blog);
if (info == null) {
out.write("Blog " + blog.toBase64() + " does not exist");
return;
}
String props[] = info.getProperties();
out.write("<table border=\"0\">");
for (int i = 0; i < props.length; i++) {
if (props[i].equals(BlogInfo.OWNER_KEY)) {
out.write("<tr><td><b>Blog:</b></td><td>");
String blogURL = HTMLRenderer.getPageURL(blog, null, -1, -1, -1, false, false);
out.write("<a href=\"" + blogURL + "\">" + Base64.encode(blog.getData()) + "</td></tr>\n");
} else if (props[i].equals(BlogInfo.SIGNATURE)) {
continue;
} else if (props[i].equals(BlogInfo.POSTERS)) {
SigningPublicKey keys[] = info.getPosters();
if ( (keys != null) && (keys.length > 0) ) {
out.write("<tr><td><b>Allowed authors:</b></td><td>");
for (int j = 0; j < keys.length; j++) {
out.write(keys[j].calculateHash().toBase64());
if (j + 1 < keys.length)
out.write("<br />\n");
}
out.write("</td></tr>\n");
}
} else {
out.write("<tr><td>" + HTMLRenderer.sanitizeString(props[i]) + ":</td><td>" +
HTMLRenderer.sanitizeString(info.getProperty(props[i])) + "</td></tr>\n");
}
}
List tags = BlogManager.instance().getArchive().getIndex().getBlogTags(blog);
if ( (tags != null) && (tags.size() > 0) ) {
out.write("<tr><td>Known tags:</td><td>");
for (int i = 0; i < tags.size(); i++) {
String tag = (String)tags.get(i);
out.write("<a href=\"" + HTMLRenderer.getPageURL(blog, tag, -1, -1, -1, false, false) + "\">" +
HTMLRenderer.sanitizeString(tag) + "</a> ");
}
out.write("</td></tr>");
}
out.write("</table>");
} else {
out.write("Blog not specified");
}
}
}

View File

@@ -0,0 +1,100 @@
package net.i2p.syndie.web;
import java.io.*;
import java.util.*;
import java.util.zip.*;
import javax.servlet.http.HttpServlet;
import javax.servlet.http.HttpServletRequest;
import javax.servlet.http.HttpServletResponse;
import javax.servlet.ServletException;
import net.i2p.data.*;
import net.i2p.syndie.*;
import net.i2p.syndie.data.*;
/**
* Dump out a whole series of blog metadata and entries as a zip stream. All metadata
* is written before any entries, so it can be processed in order safely.
*
* HTTP parameters:
* = meta (multiple values): base64 hash of the blog for which metadata is requested
* = entry (multiple values): blog URI of an entry being requested
*/
public class ExportServlet extends HttpServlet {
public void doGet(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
export(req, resp);
}
public static void export(HttpServletRequest req, HttpServletResponse resp) throws ServletException, IOException {
String meta[] = req.getParameterValues("meta");
String entries[] = req.getParameterValues("entry");
resp.setContentType("application/x-syndie-zip");
resp.setStatus(200);
OutputStream out = resp.getOutputStream();
ZipOutputStream zo = new ZipOutputStream(out);
List metaFiles = getMetaFiles(meta);
ZipEntry ze = null;
byte buf[] = new byte[1024];
int read = -1;
for (int i = 0; metaFiles != null && i < metaFiles.size(); i++) {
ze = new ZipEntry("meta" + i);
ze.setTime(0);
zo.putNextEntry(ze);
FileInputStream in = new FileInputStream((File)metaFiles.get(i));
while ( (read = in.read(buf)) != -1)
zo.write(buf, 0, read);
in.close();
zo.closeEntry();
}
List entryFiles = getEntryFiles(entries);
for (int i = 0; entryFiles != null && i < entryFiles.size(); i++) {
ze = new ZipEntry("entry" + i);
ze.setTime(0);
zo.putNextEntry(ze);
FileInputStream in = new FileInputStream((File)entryFiles.get(i));
while ( (read = in.read(buf)) != -1)
zo.write(buf, 0, read);
in.close();
zo.closeEntry();
}
zo.finish();
zo.close();
}
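// Consumer-side sketch, illustrative only and not part of the original file: how a
// fetching peer could walk the zip produced above.  Entry names are "meta0".."metaN"
// followed by "entry0".."entryM", and all metadata precedes the posts, so everything
// can be handled safely in stream order.
private static void readExport(InputStream rawStream) throws IOException {
ZipInputStream zi = new ZipInputStream(rawStream);
ZipEntry cur = null;
byte buf[] = new byte[1024];
int read = -1;
while ( (cur = zi.getNextEntry()) != null) {
ByteArrayOutputStream data = new ByteArrayOutputStream(4096);
while ( (read = zi.read(buf)) != -1)
data.write(buf, 0, read);
if (cur.getName().startsWith("meta")) {
// hand data.toByteArray() to the blog metadata parser
} else {
// hand data.toByteArray() to the post/entry parser
}
}
zi.close();
}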
private static List getMetaFiles(String blogHashes[]) {
if ( (blogHashes == null) || (blogHashes.length <= 0) ) return null;
File dir = BlogManager.instance().getArchive().getArchiveDir();
List rv = new ArrayList(blogHashes.length);
for (int i = 0; i < blogHashes.length; i++) {
byte hv[] = Base64.decode(blogHashes[i]);
if ( (hv == null) || (hv.length != Hash.HASH_LENGTH) )
continue;
File blogDir = new File(dir, blogHashes[i]);
File metaFile = new File(blogDir, Archive.METADATA_FILE);
if (metaFile.exists())
rv.add(metaFile);
}
return rv;
}
private static List getEntryFiles(String blogURIs[]) {
if ( (blogURIs == null) || (blogURIs.length <= 0) ) return null;
File dir = BlogManager.instance().getArchive().getArchiveDir();
List rv = new ArrayList(blogURIs.length);
for (int i = 0; i < blogURIs.length; i++) {
BlogURI uri = new BlogURI(blogURIs[i]);
if (uri.getEntryId() < 0)
continue;
File blogDir = new File(dir, uri.getKeyHash().toBase64());
File entryFile = new File(blogDir, uri.getEntryId() + ".snd");
if (entryFile.exists())
rv.add(entryFile);
}
return rv;
}
}

View File

@@ -0,0 +1,136 @@
package net.i2p.syndie.web;
import java.io.*;
import java.util.*;
import net.i2p.syndie.*;
import net.i2p.syndie.data.BlogURI;
import net.i2p.syndie.sml.HTMLPreviewRenderer;
/**
*
*/
public class PostBean {
private User _user;
private String _subject;
private String _tags;
private String _headers;
private String _text;
private List _filenames;
private List _fileStreams;
private List _localFiles;
private List _fileTypes;
private boolean _previewed;
public PostBean() { reinitialize(); }
public void reinitialize() {
System.out.println("Reinitializing " + (_text != null ? "(with " + _text.length() + " bytes of sml!)" : ""));
_user = null;
_subject = null;
_tags = null;
_text = null;
_headers = null;
_filenames = new ArrayList();
_fileStreams = new ArrayList();
_fileTypes = new ArrayList();
if (_localFiles != null)
for (int i = 0; i < _localFiles.size(); i++)
((File)_localFiles.get(i)).delete();
_localFiles = new ArrayList();
_previewed = false;
}
public User getUser() { return _user; }
public String getSubject() { return (_subject != null ? _subject : ""); }
public String getTags() { return (_tags != null ? _tags : ""); }
public String getText() { return (_text != null ? _text : ""); }
public String getHeaders() { return (_headers != null ? _headers : ""); }
public void setUser(User user) { _user = user; }
public void setSubject(String subject) { _subject = subject; }
public void setTags(String tags) { _tags = tags; }
public void setText(String text) { _text = text; }
public void setHeaders(String headers) { _headers = headers; }
public String getContentType(int id) {
if ( (id >= 0) && (id < _fileTypes.size()) )
return (String)_fileTypes.get(id);
return "application/octet-stream";
}
public void writeAttachmentData(int id, OutputStream out) throws IOException {
FileInputStream in = new FileInputStream((File)_localFiles.get(id));
byte buf[] = new byte[1024];
int read = 0;
while ( (read = in.read(buf)) != -1)
out.write(buf, 0, read);
in.close();
out.close();
}
public void addAttachment(String filename, InputStream fileStream, String mimeType) {
_filenames.add(filename);
_fileStreams.add(fileStream);
_fileTypes.add(mimeType);
}
public int getAttachmentCount() { return (_filenames != null ? _filenames.size() : 0); }
public BlogURI postEntry() throws IOException {
if (!_previewed) return null;
List localStreams = new ArrayList(_localFiles.size());
for (int i = 0; i < _localFiles.size(); i++) {
File f = (File)_localFiles.get(i);
localStreams.add(new FileInputStream(f));
}
return BlogManager.instance().createBlogEntry(_user, _subject, _tags, _headers, _text,
_filenames, localStreams, _fileTypes);
}
public void renderPreview(Writer out) throws IOException {
System.out.println("Subject: " + _subject);
System.out.println("Text: " + _text);
System.out.println("Headers: " + _headers);
// cache all of the _fileStreams into temporary files (stored in _localFiles), then
// render the page with an HTMLPreviewRenderer - an HTMLRenderer variant whose
// 'view attachment' links point at the locally cached files
cacheAttachments();
String smlContent = renderSMLContent();
HTMLPreviewRenderer r = new HTMLPreviewRenderer(_filenames, _fileTypes, _localFiles);
r.render(_user, BlogManager.instance().getArchive(), null, smlContent, out, false, true);
_previewed = true;
}
private String renderSMLContent() {
StringBuffer raw = new StringBuffer();
raw.append("Subject: ").append(_subject).append('\n');
raw.append("Tags: ");
StringTokenizer tok = new StringTokenizer(_tags, " \t\n");
while (tok.hasMoreTokens())
raw.append(tok.nextToken()).append('\t');
raw.append('\n');
raw.append(_headers.trim());
raw.append("\n\n");
raw.append(_text.trim());
return raw.toString();
}
private void cacheAttachments() throws IOException {
File postCacheDir = new File(BlogManager.instance().getTempDir(), _user.getBlog().toBase64());
if (!postCacheDir.exists())
postCacheDir.mkdirs();
for (int i = 0; i < _fileStreams.size(); i++) {
InputStream in = (InputStream)_fileStreams.get(i);
File f = File.createTempFile("attachment", ".dat", postCacheDir);
FileOutputStream o = new FileOutputStream(f);
byte buf[] = new byte[1024];
int read = 0;
while ( (read = in.read(buf)) != -1)
o.write(buf, 0, read);
o.close();
in.close();
_localFiles.add(f);
System.out.println("Caching attachment " + i + " temporarily in "
+ f.getAbsolutePath() + " w/ " + f.length() + "bytes");
}
_fileStreams.clear();
}
}
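
For reference, the intended lifecycle of the bean is: populate the fields, add any attachments, call renderPreview() (which caches the attachments locally and sets _previewed), and only then call postEntry(), which refuses to run without a prior preview.  A minimal sketch of driving it from servlet/JSP code - the user, writer, and attachment values below are hypothetical placeholders, with imports and exception handling omitted:

PostBean post = new PostBean();                  // typically a session-scoped bean
post.setUser(user);                              // an already-authenticated User
post.setSubject("Hello Syndie");
post.setTags("test intro");
post.setHeaders("");
post.setText("Some SML body text");
// any InputStream plus a MIME type may be attached; a local file is used purely for illustration
post.addAttachment("photo.png", new java.io.FileInputStream("photo.png"), "image/png");
post.renderPreview(out);                         // out is a java.io.Writer (e.g. the JSP writer)
BlogURI posted = post.postEntry();               // returns null if no preview was rendered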


@@ -0,0 +1,656 @@
package net.i2p.syndie.web;
import java.io.*;
import java.text.*;
import java.util.*;
import java.util.zip.*;
import net.i2p.I2PAppContext;
import net.i2p.data.*;
import net.i2p.util.EepGet;
import net.i2p.util.EepGetScheduler;
import net.i2p.util.EepPost;
import net.i2p.syndie.data.*;
import net.i2p.syndie.sml.*;
import net.i2p.syndie.*;
/**
 * Manage interaction with a remote Syndie archive: fetch its index, pull down blog
 * metadata and posts (individually or via the bulk export.zip), and push local
 * entries back to it through its import.jsp.
 */
public class RemoteArchiveBean {
private String _remoteSchema;
private String _remoteLocation;
private String _proxyHost;
private int _proxyPort;
private ArchiveIndex _remoteIndex;
private List _statusMessages;
private boolean _fetchIndexInProgress;
public RemoteArchiveBean() {
reinitialize();
}
public void reinitialize() {
_remoteSchema = null;
_remoteLocation = null;
_remoteIndex = null;
_fetchIndexInProgress = false;
_proxyHost = null;
_proxyPort = -1;
_statusMessages = new ArrayList();
}
public String getRemoteSchema() { return _remoteSchema; }
public String getRemoteLocation() { return _remoteLocation; }
public ArchiveIndex getRemoteIndex() { return _remoteIndex; }
public String getProxyHost() { return _proxyHost; }
public int getProxyPort() { return _proxyPort; }
public boolean getFetchIndexInProgress() { return _fetchIndexInProgress; }
public String getStatus() {
StringBuffer buf = new StringBuffer();
while (_statusMessages.size() > 0)
buf.append(_statusMessages.remove(0)).append("\n");
return buf.toString();
}
public void fetchMetadata(User user, Map parameters) {
String meta = ArchiveViewerBean.getString(parameters, "blog");
if (meta == null) return;
Set blogs = new HashSet();
if ("ALL".equals(meta)) {
Set localBlogs = BlogManager.instance().getArchive().getIndex().getUniqueBlogs();
Set remoteBlogs = _remoteIndex.getUniqueBlogs();
for (Iterator iter = remoteBlogs.iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
if (!localBlogs.contains(blog)) {
blogs.add(blog);
}
}
} else {
blogs.add(new Hash(Base64.decode(meta.trim())));
}
List urls = new ArrayList(blogs.size());
List tmpFiles = new ArrayList(blogs.size());
for (Iterator iter = blogs.iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
urls.add(buildMetaURL(blog));
try {
tmpFiles.add(File.createTempFile("fetchMeta", ".txt", BlogManager.instance().getTempDir()));
} catch (IOException ioe) {
_statusMessages.add("Internal error creating temporary file to fetch " + blog.toBase64() + ": " + ioe.getMessage());
}
}
for (int i = 0; i < urls.size(); i++)
_statusMessages.add("Scheduling up metadata fetches for " + HTMLRenderer.sanitizeString((String)urls.get(i)));
fetch(urls, tmpFiles, user, new MetadataStatusListener());
}
private String buildMetaURL(Hash blog) {
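// _remoteLocation points at the remote archive's index file; keep everything up to and
// including the last '/' and append <blogHash>/<metadata file>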
String loc = _remoteLocation.trim();
int root = loc.lastIndexOf('/');
return loc.substring(0, root + 1) + blog.toBase64() + "/" + Archive.METADATA_FILE;
}
public void fetchSelectedEntries(User user, Map parameters) {
String entries[] = ArchiveViewerBean.getStrings(parameters, "entry");
if ( (entries == null) || (entries.length <= 0) ) return;
List urls = new ArrayList(entries.length);
List tmpFiles = new ArrayList(entries.length);
for (int i = 0; i < entries.length; i++) {
urls.add(buildEntryURL(new BlogURI(entries[i])));
try {
tmpFiles.add(File.createTempFile("fetchBlog", ".txt", BlogManager.instance().getTempDir()));
} catch (IOException ioe) {
_statusMessages.add("Internal error creating temporary file to fetch " + HTMLRenderer.sanitizeString(entries[i]) + ": " + ioe.getMessage());
}
}
for (int i = 0; i < urls.size(); i++)
_statusMessages.add("Scheduling blog post fetching for " + HTMLRenderer.sanitizeString(entries[i]));
fetch(urls, tmpFiles, user, new BlogStatusListener());
}
public void fetchSelectedBulk(User user, Map parameters) {
String entries[] = ArchiveViewerBean.getStrings(parameters, "entry");
String action = ArchiveViewerBean.getString(parameters, "action");
if ("Fetch all new entries".equals(action)) {
ArchiveIndex localIndex = BlogManager.instance().getArchive().getIndex();
List uris = new ArrayList();
List matches = new ArrayList();
for (Iterator iter = _remoteIndex.getUniqueBlogs().iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
_remoteIndex.selectMatchesOrderByEntryId(matches, blog, null);
for (int i = 0; i < matches.size(); i++) {
BlogURI uri = (BlogURI)matches.get(i);
if (!localIndex.getEntryIsKnown(uri))
uris.add(uri);
}
matches.clear();
}
entries = new String[uris.size()];
for (int i = 0; i < uris.size(); i++)
entries[i] = ((BlogURI)uris.get(i)).toString();
}
if ( (entries == null) || (entries.length <= 0) ) return;
StringBuffer url = new StringBuffer(512);
url.append(buildExportURL());
Set meta = new HashSet();
for (int i = 0; i < entries.length; i++) {
BlogURI uri = new BlogURI(entries[i]);
if (uri.getEntryId() >= 0) {
url.append("entry=").append(uri.toString()).append('&');
meta.add(uri.getKeyHash());
_statusMessages.add("Scheduling blog post fetching for " + HTMLRenderer.sanitizeString(entries[i]));
}
}
for (Iterator iter = meta.iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
url.append("meta=").append(blog.toBase64()).append('&');
_statusMessages.add("Scheduling blog metadata fetching for " + blog.toBase64());
}
List urls = new ArrayList(1);
urls.add(url.toString());
List tmpFiles = new ArrayList(1);
try {
File tmp = File.createTempFile("fetchBulk", ".zip", BlogManager.instance().getTempDir());
tmpFiles.add(tmp);
fetch(urls, tmpFiles, user, new BulkFetchListener(tmp));
} catch (IOException ioe) {
_statusMessages.add("Internal error creating temporary file to fetch " + HTMLRenderer.sanitizeString(url.toString()) + ": " + ioe.getMessage());
}
}
private String buildExportURL() {
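// same base directory as the remote index, but pointing at export.zip; the caller appends
// entry= and meta= query parameters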
String loc = _remoteLocation.trim();
int root = loc.lastIndexOf('/');
return loc.substring(0, root + 1) + "export.zip?";
}
private String buildEntryURL(BlogURI uri) {
String loc = _remoteLocation.trim();
int root = loc.lastIndexOf('/');
return loc.substring(0, root + 1) + uri.getKeyHash().toBase64() + "/" + uri.getEntryId() + ".snd";
}
public void fetchAllEntries(User user, Map parameters) {
ArchiveIndex localIndex = BlogManager.instance().getArchive().getIndex();
List uris = new ArrayList();
List entries = new ArrayList();
for (Iterator iter = _remoteIndex.getUniqueBlogs().iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
_remoteIndex.selectMatchesOrderByEntryId(entries, blog, null);
for (int i = 0; i < entries.size(); i++) {
BlogURI uri = (BlogURI)entries.get(i);
if (!localIndex.getEntryIsKnown(uri))
uris.add(uri);
}
entries.clear();
}
List urls = new ArrayList(uris.size());
List tmpFiles = new ArrayList(uris.size());
for (int i = 0; i < uris.size(); i++) {
urls.add(buildEntryURL((BlogURI)uris.get(i)));
try {
tmpFiles.add(File.createTempFile("fetchBlog", ".txt", BlogManager.instance().getTempDir()));
} catch (IOException ioe) {
_statusMessages.add("Internal error creating temporary file to fetch " + HTMLRenderer.sanitizeString(uris.get(i).toString()) + ": " + ioe.getMessage());
}
}
for (int i = 0; i < urls.size(); i++)
_statusMessages.add("Fetch all entries: " + HTMLRenderer.sanitizeString((String)urls.get(i)));
fetch(urls, tmpFiles, user, new BlogStatusListener());
}
private void fetch(List urls, List tmpFiles, User user, EepGet.StatusListener lsnr) {
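// queue all of the URLs on a single EepGetScheduler so they are fetched through the
// configured proxy (if any) and report back to the shared status listener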
EepGetScheduler scheduler = new EepGetScheduler(I2PAppContext.getGlobalContext(), urls, tmpFiles, _proxyHost, _proxyPort, lsnr);
scheduler.fetch();
}
public void fetchIndex(User user, String schema, String location, String proxyHost, String proxyPort) {
_fetchIndexInProgress = true;
_remoteIndex = null;
_remoteLocation = location;
_remoteSchema = schema;
_proxyHost = null;
_proxyPort = -1;
if ( (schema == null) || (schema.trim().length() <= 0) ||
(location == null) || (location.trim().length() <= 0) ) {
_statusMessages.add("Location must be specified");
_fetchIndexInProgress = false;
return;
}
if ("web".equals(schema)) {
if ( (proxyHost != null) && (proxyHost.trim().length() > 0) &&
(proxyPort != null) && (proxyPort.trim().length() > 0) ) {
_proxyHost = proxyHost;
try {
_proxyPort = Integer.parseInt(proxyPort);
} catch (NumberFormatException nfe) {
_statusMessages.add("Proxy port " + HTMLRenderer.sanitizeString(proxyPort) + " is invalid");
_fetchIndexInProgress = false;
return;
}
}
} else {
_statusMessages.add("Remote schema " + HTMLRenderer.sanitizeString(schema) + " currently not supported");
_fetchIndexInProgress = false;
return;
}
_statusMessages.add("Fetching index from " + HTMLRenderer.sanitizeString(_remoteLocation) +
(_proxyHost != null ? " via " + HTMLRenderer.sanitizeString(_proxyHost) + ":" + _proxyPort : ""));
File archiveFile = new File(BlogManager.instance().getTempDir(), user.getBlog().toBase64() + "_remoteArchive.txt");
archiveFile.delete();
EepGet eep = new EepGet(I2PAppContext.getGlobalContext(), ((_proxyHost != null) && (_proxyPort > 0)),
_proxyHost, _proxyPort, 0, archiveFile.getAbsolutePath(), location);
eep.addStatusListener(new IndexFetcherStatusListener(archiveFile));
eep.fetch();
}
private class IndexFetcherStatusListener implements EepGet.StatusListener {
private File _archiveFile;
public IndexFetcherStatusListener(File file) {
_archiveFile = file;
}
public void attemptFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt, int numRetries, Exception cause) {
_statusMessages.add("Attempt " + currentAttempt + " failed after " + bytesTransferred + (cause != null ? cause.getMessage() : ""));
}
public void bytesTransferred(long alreadyTransferred, int currentWrite, long bytesTransferred, long bytesRemaining, String url) {}
public void transferComplete(long alreadyTransferred, long bytesTransferred, long bytesRemaining, String url, String outputFile) {
_statusMessages.add("Fetch of " + HTMLRenderer.sanitizeString(url) + " successful");
_fetchIndexInProgress = false;
ArchiveIndex i = new ArchiveIndex(false);
try {
i.load(_archiveFile);
_statusMessages.add("Archive fetched and loaded");
_remoteIndex = i;
} catch (IOException ioe) {
_statusMessages.add("Archive is corrupt: " + ioe.getMessage());
}
}
public void transferFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt) {
_statusMessages.add("Fetch of " + HTMLRenderer.sanitizeString(url) + " failed after " + bytesTransferred);
_fetchIndexInProgress = false;
}
}
private class MetadataStatusListener implements EepGet.StatusListener {
public MetadataStatusListener() {}
public void attemptFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt, int numRetries, Exception cause) {
_statusMessages.add("Attempt " + currentAttempt + " failed after " + bytesTransferred + (cause != null ? cause.getMessage() : ""));
}
public void bytesTransferred(long alreadyTransferred, int currentWrite, long bytesTransferred, long bytesRemaining, String url) {}
public void transferComplete(long alreadyTransferred, long bytesTransferred, long bytesRemaining, String url, String outputFile) {
_statusMessages.add("Fetch of " + HTMLRenderer.sanitizeString(url) + " successful");
File info = new File(outputFile);
FileInputStream in = null;
try {
BlogInfo i = new BlogInfo();
in = new FileInputStream(info);
i.load(in);
boolean ok = BlogManager.instance().getArchive().storeBlogInfo(i);
if (ok) {
_statusMessages.add("Blog info for " + HTMLRenderer.sanitizeString(i.getProperty(BlogInfo.NAME)) + " imported");
BlogManager.instance().getArchive().reloadInfo();
} else {
_statusMessages.add("Blog info at " + HTMLRenderer.sanitizeString(url) + " was corrupt / invalid / forged");
}
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (in != null) try { in.close(); } catch (IOException ioe) {}
info.delete();
}
}
public void transferFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt) {
_statusMessages.add("Fetch of " + HTMLRenderer.sanitizeString(url) + " failed after " + bytesTransferred);;
}
}
private class BlogStatusListener implements EepGet.StatusListener {
public BlogStatusListener() {}
public void attemptFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt, int numRetries, Exception cause) {
_statusMessages.add("Attempt " + currentAttempt + " failed after " + bytesTransferred + (cause != null ? cause.getMessage() : ""));
}
public void bytesTransferred(long alreadyTransferred, int currentWrite, long bytesTransferred, long bytesRemaining, String url) {}
public void transferComplete(long alreadyTransferred, long bytesTransferred, long bytesRemaining, String url, String outputFile) {
_statusMessages.add("Fetch of " + HTMLRenderer.sanitizeString(url) + " successful");
File file = new File(outputFile);
FileInputStream in = null;
try {
EntryContainer c = new EntryContainer();
in = new FileInputStream(file);
c.load(in);
BlogURI uri = c.getURI();
if ( (uri == null) || (uri.getKeyHash() == null) ) {
_statusMessages.add("Blog post at " + HTMLRenderer.sanitizeString(url) + " was corrupt - no URI");
return;
}
Archive a = BlogManager.instance().getArchive();
BlogInfo info = a.getBlogInfo(uri);
if (info == null) {
_statusMessages.add("Blog post " + uri.toString() + " cannot be imported, as we don't have their blog metadata");
return;
}
boolean ok = a.storeEntry(c);
if (!ok) {
_statusMessages.add("Blog post at " + url + ": " + uri.toString() + " has an invalid signature");
return;
} else {
_statusMessages.add("Blog post " + uri.toString() + " imported");
BlogManager.instance().getArchive().regenerateIndex();
}
} catch (IOException ioe) {
ioe.printStackTrace();
} finally {
if (in != null) try { in.close(); } catch (IOException ioe) {}
file.delete();
}
}
public void transferFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt) {
_statusMessages.add("Fetch of " + HTMLRenderer.sanitizeString(url) + " failed after " + bytesTransferred);
}
}
/**
 * Receive the status of a fetch for the zip containing blogs and metadata (as generated by
 * the ExportServlet).  Entries in the zip named "meta..." carry blog metadata and those
 * named "entry..." carry posts, which is how they are told apart during the import below.
 */
private class BulkFetchListener implements EepGet.StatusListener {
private File _tmp;
public BulkFetchListener(File tmp) {
_tmp = tmp;
}
public void attemptFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt, int numRetries, Exception cause) {
_statusMessages.add("Attempt " + currentAttempt + " failed after " + bytesTransferred + (cause != null ? cause.getMessage() : ""));
}
public void bytesTransferred(long alreadyTransferred, int currentWrite, long bytesTransferred, long bytesRemaining, String url) {}
public void transferComplete(long alreadyTransferred, long bytesTransferred, long bytesRemaining, String url, String outputFile) {
_statusMessages.add("Fetch of " + HTMLRenderer.sanitizeString(url.substring(0, url.indexOf('?'))) + " successful, importing the data");
File file = new File(outputFile);
ZipInputStream zi = null;
try {
zi = new ZipInputStream(new FileInputStream(file));
while (true) {
ZipEntry entry = zi.getNextEntry();
if (entry == null)
break;
ByteArrayOutputStream out = new ByteArrayOutputStream(1024);
byte buf[] = new byte[1024];
int read = -1;
while ( (read = zi.read(buf)) != -1)
out.write(buf, 0, read);
if (entry.getName().startsWith("meta")) {
BlogInfo i = new BlogInfo();
i.load(new ByteArrayInputStream(out.toByteArray()));
boolean ok = BlogManager.instance().getArchive().storeBlogInfo(i);
if (ok) {
_statusMessages.add("Blog info for " + HTMLRenderer.sanitizeString(i.getProperty(BlogInfo.NAME)) + " imported");
} else {
_statusMessages.add("Blog info at " + HTMLRenderer.sanitizeString(url) + " was corrupt / invalid / forged");
}
} else if (entry.getName().startsWith("entry")) {
EntryContainer c = new EntryContainer();
c.load(new ByteArrayInputStream(out.toByteArray()));
BlogURI uri = c.getURI();
if ( (uri == null) || (uri.getKeyHash() == null) ) {
_statusMessages.add("Blog post " + HTMLRenderer.sanitizeString(entry.getName()) + " was corrupt - no URI");
continue;
}
Archive a = BlogManager.instance().getArchive();
BlogInfo info = a.getBlogInfo(uri);
if (info == null) {
_statusMessages.add("Blog post " + HTMLRenderer.sanitizeString(entry.getName()) + " cannot be imported, as we don't have their blog metadata");
continue;
}
boolean ok = a.storeEntry(c);
if (!ok) {
_statusMessages.add("Blog post " + uri.toString() + " has an invalid signature");
continue;
} else {
_statusMessages.add("Blog post " + uri.toString() + " imported");
}
}
}
BlogManager.instance().getArchive().regenerateIndex();
} catch (IOException ioe) {
ioe.printStackTrace();
_statusMessages.add("Error importing from " + HTMLRenderer.sanitizeString(url) + ": " + ioe.getMessage());
} finally {
if (zi != null) try { zi.close(); } catch (IOException ioe) {}
file.delete();
}
}
public void transferFailed(String url, long bytesTransferred, long bytesRemaining, int currentAttempt) {
_statusMessages.add("Fetch of " + HTMLRenderer.sanitizeString(url) + " failed after " + bytesTransferred);
_tmp.delete();
}
}
public void postSelectedEntries(User user, Map parameters) {
String entries[] = ArchiveViewerBean.getStrings(parameters, "localentry");
if ( (entries == null) || (entries.length <= 0) ) return;
List uris = new ArrayList(entries.length);
for (int i = 0; i < entries.length; i++)
uris.add(new BlogURI(entries[i]));
post(uris, user);
}
private void post(List blogURIs, User user) {
Set meta = new HashSet(4);
Map uploads = new HashMap(blogURIs.size()+1);
String importURL = getImportURL();
_statusMessages.add("Uploading through " + HTMLRenderer.sanitizeString(importURL));
for (int i = 0; i < blogURIs.size(); i++) {
BlogURI uri = (BlogURI)blogURIs.get(i);
File blogDir = new File(BlogManager.instance().getArchive().getArchiveDir(), uri.getKeyHash().toBase64());
BlogInfo info = BlogManager.instance().getArchive().getBlogInfo(uri);
if (!meta.contains(uri.getKeyHash())) {
uploads.put("blogmeta" + meta.size(), new File(blogDir, Archive.METADATA_FILE));
meta.add(uri.getKeyHash());
_statusMessages.add("Scheduling upload of the blog metadata for " + HTMLRenderer.sanitizeString(info.getProperty(BlogInfo.NAME)));
}
uploads.put("blogpost" + i, new File(blogDir, uri.getEntryId() + ".snd"));
_statusMessages.add("Scheduling upload of " + HTMLRenderer.sanitizeString(info.getProperty(BlogInfo.NAME))
+ ": " + getEntryDate(uri.getEntryId()));
}
EepPost post = new EepPost();
post.postFiles(importURL, _proxyHost, _proxyPort, uploads, new Runnable() { public void run() { _statusMessages.add("Upload complete"); } });
}
private String getImportURL() {
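// walk up two path segments from the remote index URL (past the index file and the archive
// directory) to reach the Syndie web root, then append import.jsp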
String loc = _remoteLocation.trim();
int archiveRoot = loc.lastIndexOf('/');
int syndieRoot = loc.lastIndexOf('/', archiveRoot-1);
return loc.substring(0, syndieRoot + 1) + "import.jsp";
}
public void renderDeltaForm(User user, ArchiveIndex localIndex, Writer out) throws IOException {
Archive archive = BlogManager.instance().getArchive();
StringBuffer buf = new StringBuffer(512);
buf.append("<b>New blogs:</b> <select name=\"blog\"><option value=\"ALL\">All</option>\n");
Set localBlogs = archive.getIndex().getUniqueBlogs();
Set remoteBlogs = _remoteIndex.getUniqueBlogs();
int newBlogs = 0;
for (Iterator iter = remoteBlogs.iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
if (!localBlogs.contains(blog)) {
buf.append("<option value=\"" + blog.toBase64() + "\">" + blog.toBase64() + "</option>\n");
newBlogs++;
}
}
if (newBlogs > 0) {
out.write(buf.toString());
out.write("</select> <input type=\"submit\" name=\"action\" value=\"Fetch metadata\" /><br />\n");
}
int newEntries = 0;
int localNew = 0;
out.write("<table border=\"1\" width=\"100%\">\n");
List entries = new ArrayList();
for (Iterator iter = remoteBlogs.iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
buf.setLength(0);
int shownEntries = 0;
buf.append("<tr><td colspan=\"5\" align=\"left\" valign=\"top\">\n");
BlogInfo info = archive.getBlogInfo(blog);
if (info != null) {
buf.append("<a href=\"" + HTMLRenderer.getPageURL(blog, null, -1, -1, -1, user.getShowExpanded(), user.getShowImages()) + "\"><b>" + HTMLRenderer.sanitizeString(info.getProperty(BlogInfo.NAME)) + "</b></a>: " +
HTMLRenderer.sanitizeString(info.getProperty(BlogInfo.DESCRIPTION)) + "\n");
} else {
buf.append("<b>" + blog.toBase64() + "</b>\n");
}
buf.append("</td></tr>\n");
buf.append("<tr><td>&nbsp;</td><td nowrap=\"true\"><b>Posted on</b></td><td nowrap=\"true\"><b>#</b></td><td nowrap=\"true\"><b>Size</b></td><td width=\"90%\" nowrap=\"true\"><b>Tags</b></td></tr>\n");
entries.clear();
_remoteIndex.selectMatchesOrderByEntryId(entries, blog, null);
for (int i = 0; i < entries.size(); i++) {
BlogURI uri = (BlogURI)entries.get(i);
buf.append("<tr>\n");
if (!archive.getIndex().getEntryIsKnown(uri)) {
buf.append("<td><input type=\"checkbox\" name=\"entry\" value=\"" + uri.toString() + "\" /></td>\n");
newEntries++;
shownEntries++;
} else {
String page = HTMLRenderer.getPageURL(blog, null, uri.getEntryId(), -1, -1,
user.getShowExpanded(), user.getShowImages());
buf.append("<td><a href=\"" + page + "\">(local)</a></td>\n");
}
buf.append("<td>" + getDate(uri.getEntryId()) + "</td>\n");
buf.append("<td>" + getId(uri.getEntryId()) + "</td>\n");
buf.append("<td>" + _remoteIndex.getBlogEntrySizeKB(uri) + "KB</td>\n");
buf.append("<td>");
for (Iterator titer = new TreeSet(_remoteIndex.getBlogEntryTags(uri)).iterator(); titer.hasNext(); ) {
String tag = (String)titer.next();
buf.append("<a href=\"" + HTMLRenderer.getPageURL(blog, tag, -1, -1, -1, user.getShowExpanded(), user.getShowImages()) + "\">" + tag + "</a> \n");
}
buf.append("</td>\n");
buf.append("</tr>\n");
}
if (shownEntries > 0) {
out.write(buf.toString());
buf.setLength(0);
}
int remote = shownEntries;
// now for posts in known blogs that we have and they don't
entries.clear();
localIndex.selectMatchesOrderByEntryId(entries, blog, null);
buf.append("<tr><td colspan=\"5\">Entries we have, but the remote Syndie doesn't:</td></tr>\n");
for (int i = 0; i < entries.size(); i++) {
BlogURI uri = (BlogURI)entries.get(i);
if (!_remoteIndex.getEntryIsKnown(uri)) {
buf.append("<tr>\n");
buf.append("<td><input type=\"checkbox\" name=\"localentry\" value=\"" + uri.toString() + "\" /></td>\n");
shownEntries++;
newEntries++;
localNew++;
buf.append("<td>" + getDate(uri.getEntryId()) + "</td>\n");
buf.append("<td>" + getId(uri.getEntryId()) + "</td>\n");
buf.append("<td>" + localIndex.getBlogEntrySizeKB(uri) + "KB</td>\n");
buf.append("<td>");
for (Iterator titer = new TreeSet(localIndex.getBlogEntryTags(uri)).iterator(); titer.hasNext(); ) {
String tag = (String)titer.next();
buf.append("<a href=\"" + HTMLRenderer.getPageURL(blog, tag, -1, -1, -1, user.getShowExpanded(), user.getShowImages()) + "\">" + tag + "</a> \n");
}
buf.append("</td>\n");
buf.append("</tr>\n");
}
}
if (shownEntries > remote) // skip blogs we have already syndicated
out.write(buf.toString());
}
// now for posts in blogs we have and they don't
int newBefore = localNew;
buf.setLength(0);
buf.append("<tr><td colspan=\"5\">Blogs the remote Syndie doesn't have</td></tr>\n");
for (Iterator iter = localBlogs.iterator(); iter.hasNext(); ) {
Hash blog = (Hash)iter.next();
if (remoteBlogs.contains(blog)) {
//System.err.println("Remote index has " + blog.toBase64());
continue;
}
entries.clear();
localIndex.selectMatchesOrderByEntryId(entries, blog, null);
for (int i = 0; i < entries.size(); i++) {
BlogURI uri = (BlogURI)entries.get(i);
buf.append("<tr>\n");
buf.append("<td><input type=\"checkbox\" name=\"localentry\" value=\"" + uri.toString() + "\" /></td>\n");
buf.append("<td>" + getDate(uri.getEntryId()) + "</td>\n");
buf.append("<td>" + getId(uri.getEntryId()) + "</td>\n");
buf.append("<td>" + localIndex.getBlogEntrySizeKB(uri) + "KB</td>\n");
buf.append("<td>");
for (Iterator titer = new TreeSet(localIndex.getBlogEntryTags(uri)).iterator(); titer.hasNext(); ) {
String tag = (String)titer.next();
buf.append("<a href=\"" + HTMLRenderer.getPageURL(blog, tag, -1, -1, -1, user.getShowExpanded(), user.getShowImages()) + "\">" + tag + "</a> \n");
}
buf.append("</td>\n");
buf.append("</tr>\n");
localNew++;
}
}
if (localNew > newBefore)
out.write(buf.toString());
out.write("</table>\n");
if (newEntries > 0) {
out.write("<input type=\"submit\" name=\"action\" value=\"Fetch selected entries\" /> \n");
out.write("<input type=\"submit\" name=\"action\" value=\"Fetch all new entries\" /> \n");
} else {
out.write(HTMLRenderer.sanitizeString(_remoteLocation) + " has no new posts to offer us\n");
}
if (localNew > 0) {
out.write("<input type=\"submit\" name=\"action\" value=\"Post selected entries\" /> \n");
}
out.write("<hr />\n");
}
private final SimpleDateFormat _dateFormat = new SimpleDateFormat("yyyy/MM/dd", Locale.UK);
private String getDate(long when) {
synchronized (_dateFormat) {
return _dateFormat.format(new Date(when));
}
}
private final String getEntryDate(long when) {
synchronized (_dateFormat) {
try {
String str = _dateFormat.format(new Date(when));
long dayBegin = _dateFormat.parse(str).getTime();
return str + "." + (when - dayBegin);
} catch (ParseException pe) {
pe.printStackTrace();
// shouldn't happen - we're parsing a string we just formatted
return "unknown";
}
}
}
private long getId(long id) {
synchronized (_dateFormat) {
try {
String str = _dateFormat.format(new Date(id));
long dayBegin = _dateFormat.parse(str).getTime();
return (id - dayBegin);
} catch (ParseException pe) {
pe.printStackTrace();
// shouldn't happen - we're parsing a string we just formatted
return id;
}
}
}
}
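
The date helpers above treat an entryId as a millisecond timestamp: the rendered form is the day the entry falls on plus its millisecond offset into that day.  A small self-contained illustration using only JDK classes - the entryId value is made up, and the printed day depends on the JVM's default time zone:

import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.Locale;

public class EntryIdDemo {
    public static void main(String args[]) throws Exception {
        SimpleDateFormat fmt = new SimpleDateFormat("yyyy/MM/dd", Locale.UK);
        long entryId = 1125532800123L;               // hypothetical entryId (ms since the epoch)
        String day = fmt.format(new Date(entryId));  // the day portion, e.g. "2005/09/01"
        long dayBegin = fmt.parse(day).getTime();    // midnight of that day
        // prints e.g. "2005/09/01.123" - the same form getEntryDate() produces
        System.out.println(day + "." + (entryId - dayBegin));
    }
}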


@@ -0,0 +1,9 @@
<%@page contentType="text/html; charset=UTF-8" import="net.i2p.syndie.web.ArchiveViewerBean, net.i2p.syndie.*" %>
<% request.setCharacterEncoding("UTF-8"); %>
<jsp:useBean scope="session" class="net.i2p.syndie.User" id="user" /><table border="0" width="100%">
<tr><form action="index.jsp"><td nowrap="true">
<b>Blogs:</b> <%ArchiveViewerBean.renderBlogSelector(user, request.getParameterMap(), out);%>
<input type="submit" value="Refresh" />
<input type="submit" name="action" value="<%=ArchiveViewerBean.SEL_ACTION_SET_AS_DEFAULT%>" />
<!-- char encoding: [<%=response.getCharacterEncoding()%>] content type [<%=response.getContentType()%>] Locale [<%=response.getLocale()%>] -->
<%ArchiveViewerBean.renderBlogs(user, request.getParameterMap(), out, "</td></form></tr><tr><td align=\"left\" valign=\"top\">");%></td></tr></table>


@@ -0,0 +1,3 @@
<%@page import="net.i2p.syndie.web.ArchiveViewerBean, net.i2p.syndie.*, net.i2p.data.Base64" %>
<jsp:useBean scope="session" class="net.i2p.syndie.User" id="user" />
<jsp:useBean scope="session" class="net.i2p.syndie.data.TransparentArchiveIndex" id="archive" />


@@ -0,0 +1 @@
<!-- nada -->


@@ -0,0 +1,5 @@
<%@page import="net.i2p.syndie.BlogManager" %>
<jsp:useBean scope="session" class="net.i2p.syndie.User" id="user" />
<!--
<center>[syndiemedia]</center>
-->

Some files were not shown because too many files have changed in this diff.