Speed of 64-bit Pro/E and Pro/M?

Discussion in 'Pro/Engineer & Creo Elements/Pro' started by dgeesaman, Mar 19, 2007.

  1. dgeesaman

    dgeesaman Guest

Well, it's well established that the 64-bit platform opens the door
    to making larger models. What isn't consistent is how the performance
    changes - the benchmarks I've seen show apps getting faster, getting
    slower, or staying the same.

    For those of you who have crossed the divide, has performance
    improved or slowed, all other things held equal? Has memory usage
    changed?

    Dave
     
    dgeesaman, Mar 19, 2007
    #1
  2. dgeesaman

    dgeesaman Guest

    I recently came across a PTC FAQ document that states the 64-bit
    version can be 5-20% slower than the 32-bit equivalent solution.
    Whether that's true in all cases leaves plenty of room for comment.

    Dave
     
    dgeesaman, Mar 19, 2007
    #2
  3. So, if you never have an assembly or analysis that runs over (by a
    Task Manager Process display of xtop.exe activity) about half of the
    32 bit rated capacity, hey, no worries. If you're on or past the
    margin, get the 64 bit architecture because the demands will get
    worse, only worse. And your scale of comparison will be pass/fail,
    not minute percentage differences.

    It is not a black-and-white issue, since we don't run the same exact
    analysis all day every day. Experienced FEA users know there is
    almost always a simpler and less resource-intensive set of
    assumptions.

    If Mechanica turns out to run at roughly the same speed on a 64 bit
    platform, then it's a matter of cost and other issues whether to run
    on 64-bit.
    You gain the ability to run larger analyses, but if there is an overall
    performance penalty, then from a user point of view the tradeoff is
    between larger model limits vs. performance on smaller analyses.

    So I ask again: has anyone compared Pro/M on 32- and 64-bit
    platforms for raw performance?

    Dave
     
    David Geesaman, Mar 20, 2007
    #3
  4. dgeesaman

    David Janes Guest

    I recently came across a PTC FAQ document that states the 64-bit
    version can be 5-20% slower than the 32-bit equivalent solution.
    Whether that's true in all cases leaves plenty of room for comment.

    Dave

    I think you stated the most cogent and convincing argument in your
    first post. It's pretty black and white: I can't run it (assembly,
    analysis, Mechanica, Mechanisms, rendering, any highly memory- or
    storage-intensive function) in a 32 bit environment, but I can
    (without errors, freezing, crashing) with 64 bit hardware, software
    and OS. If you're calculating advantage on a finer accuracy scale
    than this, I think PTC is correct: it weighs in favor of 32 bit
    architecture (and from everything I've seen as well). But perhaps
    only marginally in many cases.

    The rest of the argument is how you compare systems. Eventually, by
    price, you won't be able to avoid 64 bit machines because they've
    become so much more prevalent; OSes you can get in either
    architecture, and programs as well. So it depends on those elusive
    benchmarks, which are a) few and far between when it comes to
    realistic Pro/e use and b) never compared at the margin of utility
    (where 32 bit is under stress).

    So, if you never have an assembly or analysis that runs over (by a
    Task Manager Process display of xtop.exe activity) about half of the
    32 bit rated capacity, hey, no worries. If you're on or past the
    margin, get the 64 bit architecture, because the demands will only
    get worse. And your scale of comparison will be pass/fail, not
    minute percentage differences.
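The rule of thumb above (watch xtop.exe in Task Manager and stay under about half the 32 bit rated capacity) can be written down as a quick check. A minimal Python sketch: the ~3 GB ceiling is an assumption for 32 bit Windows booted with the /3GB switch (a plain 32 bit Windows process gets roughly 2 GB), and the half-capacity threshold is the poster's heuristic, not a PTC figure.

```python
# Rule-of-thumb check: compare an observed xtop.exe memory reading
# (e.g. from Task Manager, in bytes) against the 32-bit per-process
# ceiling. The 3 GB ceiling is an assumption (32-bit Windows + /3GB).
CEILING_BYTES = 3 * 1024**3

def recommend_arch(rss_bytes, ceiling=CEILING_BYTES):
    """Under half the ceiling: 32-bit is fine; at/past the margin: go 64-bit."""
    if rss_bytes < ceiling / 2:
        return "32-bit is fine"
    return "move to 64-bit"

print(recommend_arch(1 * 1024**3))  # 1 GB reading -> 32-bit is fine
print(recommend_arch(2 * 1024**3))  # 2 GB reading -> move to 64-bit
```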

    David Janes
     
    David Janes, Mar 20, 2007
    #4
  5. dgeesaman

    David Janes Guest

    So, if you never have an assembly or analysis that runs over (by a
    Task Manager Process display of xtop.exe activity) about half of the
    32 bit rated capacity, hey, no worries. If you're on or past the
    margin, get the 64 bit architecture because the demands will get
    worse, only worse. And your scale of comparison will be pass/fail,
    not minute percentage differences.

    It is not a black-and-white issue, since we don't run the same exact
    analysis all day every day. Experienced FEA users know there is
    almost always a simpler and less resource-intensive set of
    assumptions.

    If Mechanica turns out to run at roughly the same speed on a 64 bit
    platform, then it's a matter of cost and other issues whether to run
    on 64-bit.
    You gain the ability to run larger analyses, but if there is an overall
    performance penalty, then from a user point of view the tradeoff is
    between larger model limits vs. performance on smaller analyses.

    So I ask again: has anyone compared Pro/M on 32- and 64-bit
    platforms for raw performance?

    Dave
    I'd suggest you check here for the broadest, most general experience:
    http://proesite.com/ and check the benchmarks for 32 and 64 bit
    machines. Hopefully this represents 32 bit machines running 32 bit
    apps and 64 bit machines running 64 bit apps, not 64 bit
    machines/OSes running 32 bit apps.

    These can be very difficult, very complicated analyses, requiring
    testing professionals to get involved with their scientifically set
    up testing labs (it's all about the numbers). And the numbers can be
    all about the test setup (witness the number of different results
    from the same machine). But, in general, note that the results are
    more than twice the time for the 64 bit machines.

    David Janes
     
    David Janes, Mar 20, 2007
    #5
  6. dgeesaman

    Ant Guest

    At my last company we were hitting the memory brick wall on 32 bit.
    As soon as we went from a 4GB 32 bit PC to a 12GB 64 bit PC, some
    analyses were reduced from 24 hours to 3 hours. But it did crash a
    bit more often.

    Does anyone know what the benefits of Vista might be? 32 bit & 64
    bit?

    Ant
     
    Ant, Mar 20, 2007
    #6
  7. Based on what I know of Vista, it does not change anything with
    respect to memory limits compared to XP Pro 32 and 64 bit, except
    that Vista uses more resources for itself, all other things held
    equal.

    Dave
     
    David Geesaman, Mar 20, 2007
    #7
  8. dgeesaman

    David Janes Guest


    We ran a simple analysis on 3 machines: a Dell 370 P4 @ 3.2GHz
    w/2GB, a Dell 390 DuoCore @ 3GHz w/4GB, and a Dell 670 P4 @ 3.2GHz
    w/4GB running Win XP64.

    The 370 bogged down and ran out of speed part way through the analysis. It
    did finish, but the CPU was running at 100%.
    The 390 plowed right through it in 64% of the 370 time.
    The 670 plowed through it in 67% of the 370 time.

    Your numbers may vary, depending on actual machine specs.

    --
    Ben
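Ben's relative times above convert directly into speedup factors (speedup = 1 / relative time), a quick sketch of the arithmetic:

```python
# Convert the reported relative run times (fraction of the Dell 370's
# time) into speedup factors: speedup = 1 / relative_time.
times = {
    "Dell 390": 0.64,  # ran in 64% of the 370's time
    "Dell 670": 0.67,  # ran in 67% of the 370's time
}

speedups = {machine: 1.0 / t for machine, t in times.items()}
for machine, s in speedups.items():
    print(f"{machine}: {s:.2f}x faster than the Dell 370")
# Dell 390: 1.56x faster than the Dell 370
# Dell 670: 1.49x faster than the Dell 370
```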


    The numbers are helpful, Ben, thanks for the insight. And because I
    think the numbers can, at times, be helpful, I presented the
    somewhat bigger picture you can get from Corten's Proesite.com. But
    I'm always baffled by how to evaluate the results.

    Yours were surprising. I would have expected the numbers for the
    390, with its several advantages (2x memory, dual core processor,
    50% faster FSB and RAM), to be much better than a mere 64% of the
    bogged-down machine's time. The doubled RAM alone should have given
    it an enormous advantage, but with a dual core processor that didn't
    even exist when the 370 was released, the 390's times should have
    been less than half those of the 370 (my guesstimate from other
    reviews). But the most surprising thing is the result of the 670
    (assuming no dual core processor), with no apparent spec advantage
    over the 390 other than 64 bit architecture: it still ran about 50%
    faster than the 370, for no apparent reason! Any theories why?

    Possibly that, for reliable results (i.e., where interpretation can
    be better than guesswork), this type of testing ought to be left to
    testing professionals. When those professionals do this kind of
    test, they set up a 32 bit Dell 670 system alongside a 64 bit Dell
    670 system (no hardware/software difference save the architecture)
    and run the same test on each. Those results might say something
    about the advantages (again, under what conditions and levels of
    stress to the system) of a particular architecture. But I don't need
    a 3-machine faceoff to tell me I want a 64 bit machine if I'm
    regularly exceeding memory limits and freezing/crashing the
    analysis!

    So the issue, in my estimation, comes back to what I raised in my
    original post: get the right tool for the job. If you're a tin or
    silver smith, you don't need the rock-crushing 6 pound sledge, you
    need the half ounce ball peen. It's a question of scale ~ your
    problems vs. your resources: if they are evenly matched, you'll
    quickly and easily exceed them. Then you need to acquire more
    resources. 4 gigs is the addressable limit on 32 bit OSes; when you
    need 6, 8, or 16, you need a 64 bit system. The same is true if you
    know you are outgrowing your present system because of constantly
    greater demands on resources. So it's demands vs. resources PLUS
    tendencies.
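The 4-gig figure is just pointer arithmetic, which a two-line sketch makes concrete (real CPUs and OSes cap the 64-bit space well below the theoretical limit):

```python
# A 32-bit address can name 2**32 distinct bytes; 64-bit pointers lift
# the theoretical ceiling far past any RAM you could install.
GIB = 1024**3

print(2**32 // GIB)  # 4 GiB addressable with 32-bit pointers
print(2**64 // GIB)  # 17179869184 GiB (16 EiB) with 64-bit pointers
```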

    Anyway, thanks for the faceoff. It was interesting and thought provoking. Doubtless you learned some things as well.

    David Janes
     
    David Janes, Mar 21, 2007
    #8
  9. dgeesaman

    David Janes Guest

    At my last company we were hitting the memory brick wall on 32 bit.
    As soon as we went from a 4GB 32 bit PC to a 12GB 64 bit PC some
    analysis were reduced from 24 hours to 3 hours. Bit it did crash a
    bit more often.

    Does anyone know what the benefits of Vista might be? 32 bit & 64
    bit?

    Ant

    That's a very good question, and I don't know the answer. But from
    all I've heard, Vista is simply XP Plus, so it more or less depends
    on what you found XP to be and what you thought of its limitations.
    One thing I've not seen is a list of enhancements over XP, or of XP
    limitations that were surpassed with Vista. IOW, I've not seen one
    single advantage to upgrading.

    David Janes
     
    David Janes, Mar 21, 2007
    #9
  10. dgeesaman

    Ant Guest

    Someone told me that Vista can use USB stick memory to reduce how
    much it swaps to the HDD. I don't know if this applies to Vista
    32 bit, though.
     
    Ant, Mar 21, 2007
    #10
  11. dgeesaman

    dgeesaman Guest

    Thanks Ben.

    Today I set up RAID 1 on my wife's server and that has liberated a
    couple of hard drives. So this weekend I'll install XP Pro 64 bit on
    one of them and Pro/E, Pro/M 64 bit also.

    This way I can do a test with nearly all other things held equal.
    The system will only have 2GB of RAM, but that should be sufficient
    for a raw speed comparison on modestly sized models.

    Dave
     
    dgeesaman, Mar 21, 2007
    #11
  12. dgeesaman

    David Janes Guest

    Thanks Ben.

    Today I set up RAID 1 on my wife's server and that has liberated a
    couple of hard drives. So this weekend I'll install XP Pro 64 bit on
    one of them and Pro/E, Pro/M 64 bit also.

    This way I can do a test with nearly all other things held equal. The
    system will only have 2GB of RAM, but should be plenty sufficient to
    get a raw speed comparison for modestly sized models.

    Dave

    The one thing that I've never seen measured is a standalone license
    of Pro/e vs. a networked one. Incomprehensible, considering how
    heavily networked Pro/e is and how MUCH this can influence
    performance. And HD speed is also hardly ever quantified, but I have
    experienced huge delays when Pro/e decides to start disk swapping.

    Good luck with your analysis. Submit your results to proesite.com
    for comparison value, especially to compare a few of your own
    configs to the average. Again, what it means for troubleshooting
    purposes, I'm not sure.

    David Janes
     
    David Janes, Mar 22, 2007
    #12
  13. dgeesaman

    dgeesaman Guest

    Interesting question. However, my understanding is that the license
    server traffic is somewhere between tiny and minuscule, so I don't
    see how that could affect performance.

    As with anything, RAM is faster than HDD swapping. Faster hard
    drives almost always cost more than adding more RAM and are
    substantially lower bang-for-the-buck. If disk swapping is slowing
    you down, add RAM until the swapping stops.
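The "add RAM until the swapping stops" advice follows from the latency gap: a page fault serviced by a 2007-era mechanical disk costs on the order of five decimal orders of magnitude more than a RAM access. The figures below are ballpark assumptions, not measurements:

```python
# Ballpark latencies (assumed, order-of-magnitude only):
RAM_ACCESS_NS = 100        # ~100 ns for a memory access
DISK_SEEK_NS = 10_000_000  # ~10 ms for a mechanical disk seek

penalty = DISK_SEEK_NS // RAM_ACCESS_NS
print(f"One disk-backed page fault ~= {penalty:,} RAM accesses")
```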
    My intent to compare Mechanica runtimes isn't supported by the OCUS
    benchmark, but I'll try to run that too if I find time. Is it true
    that the 32- and 64-bit versions are indeed identical scripts?

    I'm a little consumed right now with configuring a new SBS2003
    arrangement for my spouse's company. This 32- vs. 64- comparison is
    not on a pressing schedule.

    Dave
     
    dgeesaman, Mar 22, 2007
    #13
  14. Good point. Some models are large in footprint and generate huge
    temp files; other models need a lot of RAM and CPU time but don't
    create large temp files. I totally agree - if you're seeing gigs
    upon gigs being written to disk, then a RAID 0 setup is worthwhile.

    PTC once told us to set it to 40% of the system RAM. That seems to
    work OK for us, although the parameter doesn't seem to make much
    difference either way.

    Dave
     
    David Geesaman, Mar 23, 2007
    #14