3D SiteMap Capacity Estimates?

PostPosted: Thu Jan 22, 2009 7:58 am
by ruben.martinez
Thanks for your prompt response. The eval/evauto now lets me generate a
small 3D sitemap of a web site.

I can see that the Num. Links and Num. URLs are limited to 500 and 100
respectively. I read the forum threads about capacity

- viewtopic.php?f=13&t=190
- viewtopic.php?f=13&t=250

but a few points are still unclear to me.

- Assuming a powerful machine with 3 GB+ of RAM, would the 3D rendering app
work on a site tens or even hundreds of thousands of URLs in size? By URLs I
mean actual HTML documents, with extensions like .css and .jpg filtered out,
off-site pages filtered, subdomains excluded, etc.
- If it can handle such sizes, would it still be workable, i.e. would the
zoom and browsing functionality be fast enough to allow inspection of the
3D sitemap?
- Can the 3D sitemap display the edges connecting parents and/or children
by default, without interacting with the rendering via the mouse cursor?

Thank you in advance for your answers.

Best Regards,
Ruben Martinez

Re: 3D SiteMap Capacity Estimates?

PostPosted: Thu Jan 22, 2009 9:03 am
by eValid
(1) Architecture

Some facts about the 3D-SiteMap architecture may help resolve this:

(a) The 3D-SiteMap is a Java Applet, so it runs in the JRE.

(b) The data it reads is the set of link pairs, so the storage requirement is a function of how many links there are and how long the link strings themselves are.

(c) We know from experience that very large maps, e.g. those with over 10,000 links, experience degraded performance. Such a very large collection of links has to be read in, the depth and positioning calculated, and the results displayed.

(d) The algorithms used in the 3D-SiteMap are very efficient and run entirely in RAM, but as you know, if you run out of RAM then Windows and the JRE will start allocating space on the swap device... nothing is lost, but things get slower.

(e) The bottom line is: you have to try it to see if the combination of link density, link string length, total number of links, and depth of the tree does or doesn't fit on your machine.

As they say sometimes in the auto showroom: Your Mileage Will Vary.

But to give you comfort on the size point: tens of thousands of links, yes; hundreds of thousands of links, maybe not.
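To get a rough feel for the RAM argument in points (b) through (d), here is a back-of-the-envelope sketch. It is not part of eValid; the per-character and per-object overhead figures are assumptions chosen to resemble typical JVM string costs, and the function name is hypothetical:

```python
def estimate_link_memory(num_links, avg_url_len, bytes_per_char=2, overhead=48):
    """Approximate bytes needed to hold num_links link pairs in RAM.

    Each link pair stores two URL strings (parent and child).  The
    bytes_per_char and per-object overhead values are rough, assumed
    figures for JVM-style strings, not measured eValid numbers.
    """
    per_pair = 2 * (avg_url_len * bytes_per_char + overhead)
    return num_links * per_pair

# 10,000 links with 80-character URLs: a few MB of raw strings,
# which fits comfortably in RAM.
print(estimate_link_memory(10_000, 80))    # 4160000 bytes (~4 MB)

# 500,000 links: hundreds of MB before layout and rendering buffers
# are added on top -- the regime where swapping starts to hurt.
print(estimate_link_memory(500_000, 80))   # 208000000 bytes (~208 MB)
```

The point of the sketch is only that raw link storage scales linearly with link count and URL length, so the practical limit comes from the multipliers (layout structures, display buffers) rather than the link strings alone.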

(2) Workability

In terms of "workability," that is a different matter. The zoom and browsing functions work -- possibly slowing down on the larger maps -- but another limiting factor has to be taken into account: the utility of the picture.

We think that in practice you may need to consider all of these:

(a) Snip off the unnecessary references, as you correctly suggest, leaving out pictures (GIFs, JPGs, PNGs, CSS files, and the like).

(b) Prune the tree by limiting the depth.

(c) Re-focus the center of the 3D-SiteMap on the single page you want to learn about...

All of these will, with practice, yield good insights...which is what it is all about.
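Pruning steps (a) and (b) above can be sketched as a simple filter over link pairs. The data layout and helper name here are hypothetical illustrations, not eValid's actual API:

```python
# Sketch of pruning steps (a) and (b): drop static-asset references and
# cap the tree depth before handing link pairs to a mapper.
SKIP_EXTENSIONS = (".gif", ".jpg", ".png", ".css")

def prune_links(link_pairs, max_depth):
    """Keep only (depth, parent, child) entries within max_depth whose
    child is not a static asset.  Hypothetical helper for illustration."""
    kept = []
    for depth, parent, child in link_pairs:
        if depth > max_depth:
            continue                                  # (b) prune by depth
        if child.lower().endswith(SKIP_EXTENSIONS):
            continue                                  # (a) snip asset refs
        kept.append((depth, parent, child))
    return kept

links = [
    (1, "/", "/about.html"),
    (1, "/", "/logo.png"),
    (3, "/a.html", "/deep/page.html"),
]
print(prune_links(links, max_depth=2))  # [(1, '/', '/about.html')]
```

Filtering before mapping keeps both the memory footprint and the visual clutter down, which is what makes the resulting picture useful.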

(3) Edge Display

Actually, the default display shows all of the children and parents of the root node, without doing anything.

When you mouse over a node, the display reconfigures to show the same data for just that node's local set of dependencies.

Hope this answers your questions.

eValid Support

Re: 3D SiteMap Capacity Estimates?

PostPosted: Fri Jan 23, 2009 7:26 am
by ruben.martinez
Thank you for the answers. They all make sense. The software is ambitious and pretty impressive, I have to say.

good topic

PostPosted: Sun Mar 01, 2009 2:49 pm
by HolyBat
+1

hello

PostPosted: Mon Mar 02, 2009 11:30 pm
by NipelHOD
best topic