RE: [xep-support] Slow Performance

From: Kevin Ross (Kevin.Ross@iVerticalLeap.com)
Date: Wed Feb 19 2003 - 05:56:43 PST


    >>> increase the JVM's initial ram to 512m

    Do you mean you upped the minimum (-Xms) or the maximum (-Xmx)?

    I upped the maximum heap on the JVM and saw a huge increase in
    performance. If you didn't use -Xmx, try adding -Xmx400m to your batch
    file.
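
    For example, if your batch file launches XEP through the command-line
    driver, the change would look something like this (the class name and
    file names are only illustrative; use whatever your xep.bat already
    invokes):

        rem before: the JVM runs with its default maximum heap
        rem java com.renderx.xep.XSLDriver input.fo output.pdf

        rem after: start the heap at 128MB and let it grow to 400MB
        java -Xms128m -Xmx400m com.renderx.xep.XSLDriver input.fo output.pdf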

    -Kevin

    -----Original Message-----
    From: owner-xep-support@www.renderx.com
    [mailto:owner-xep-support@www.renderx.com] On Behalf Of
    mottoson@greenshield.ca
    Sent: Wednesday, February 19, 2003 7:11 AM
    To: xep-support@renderx.com
    Subject: [xep-support] Slow Performance

    Hi,

    We're running XEP Version 3.2.1 Client Edition and have hit a
    performance wall. The .fo file I'm testing is 42MB, composed of approx.
    600 fo:page-sequences. The .fo does contain quite a number of tables,
    none of them nested.

    During the "(document" step, XEP appears to process the first 400
    page-sequences in an acceptable time, then drops to a rate of approx.
    4 min/sequence. Finally, after 434 it dies with
    java.lang.OutOfMemoryError. It never reaches the "(format" step.

    This test file is of moderate size. We expect worst-case files to be
    potentially 20 times larger. We could, of course, split the files up. To
    test this I split the same file into the following sizes, and each piece
    produced its final PDF. I'll express the performance in terms of total
    render time and PDF pages/min:

    29MB - 6 hours, 2.3 pg/min
    13MB - 2.5 min, 271.2 pg/min
    0.5MB - 15 sec, 248 pg/min
    31K - 8 sec, 30 pg/min
    8K - 7 sec, 8.6 pg/min

    The degradation on the smaller files is, I'm sure, attributable to
    startup/shutdown overhead. Is there something I can do to get past the
    degradation on the larger files? Can you recommend ways to tune the .fo
    file so that XEP can process it faster, and completely?

    If we must split the files (not our preferred solution), it would be
    easier to split them into files of around the 31K size. Is there a way,
    with our version, to process multiple files and avoid the
    startup/shutdown overhead?
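
    (If something like the following were possible in a single JVM, it would
    solve the overhead problem. The renderFoToPdf() helper is purely
    hypothetical; it stands in for whatever entry point, if any, our XEP
    edition exposes. Only the loop structure matters: start the JVM once,
    render many files.)

        import java.io.File;

        public class BatchRender {
            public static void main(String[] args) throws Exception {
                // Pay the JVM startup cost once, then render every .fo file
                // found in the directory given as the first argument.
                File[] inputs = new File(args[0]).listFiles();
                for (int i = 0; i < inputs.length; i++) {
                    File fo = inputs[i];
                    if (!fo.getName().endsWith(".fo")) continue;
                    String base = fo.getPath();
                    File pdf = new File(
                            base.substring(0, base.length() - 3) + ".pdf");
                    renderFoToPdf(fo, pdf);
                }
            }

            // Hypothetical placeholder: replace the body with the actual
            // XEP call for your edition. The point is only that the
            // formatter is created once per JVM rather than once per file.
            static void renderFoToPdf(File fo, File pdf) {
                throw new UnsupportedOperationException(
                        "wire this to the XEP API");
            }
        }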

    Platform details:
    - XEP Version 3.2.1 Client Edition
    - Windows NT 4.0 SP6a
    - dual-processor P4, 2.0 GHz, 512MB RAM
    - Java(TM) 2 Runtime Environment, Standard Edition (build 1.4.1-b21)
    - The only changes to the out-of-box settings I've made are to disable
      validation and to increase the JVM's initial RAM to 512m.

    This e-mail is confidential and is intended solely for the use of the
    person or organization to which it was addressed. If you have received
    it in error, please notify us immediately by replying to the sender.
    -------------------
    (*) To unsubscribe, send a message with words 'unsubscribe xep-support'
    in the body of the message to majordomo@renderx.com from the address
    you are subscribed from.
    (*) By using the Service, you expressly agree to these Terms of Service
    http://www.renderx.com/tos.html



