    #11397
    JLove
    Participant

    The strategy I initially elected, both to deal with NV2 learning issues and to prepare for conversion, was to go ahead with a conversion of my existing books on a separate machine from our production environment. “Playing” with real data that I thoroughly understand seems to me a better course than using the “demo” books, which model someone else’s business. At least that was my thinking two days ago.

    Exporting from NV1 using the newest (v70) export program worked reasonably well, so I proceeded to the import phase.

    I began the import on Friday afternoon, relying on the default settings and figuring that it would be good to let the program run over the weekend. The computer is not particularly fast, but it has 1 GB of memory and ample disk space, so I was not expecting any major hardware-related bottlenecks. The dataset covers about 10 years of transactions, roughly 200k of them.

    The import program batches the transactions, and the first batch ran in 11 minutes. That seemed pretty good to me. I went home and returned Saturday to check up on things, only to discover that the main progress bar had moved from 4% to just 30% in almost 24 hours. Each successive batch took a full hour longer than the preceding one, so the last batch that had completed ran in 5 hours.

    I took some steps I hoped would improve the speed, such as shutting down all unnecessary auxiliary applications, and left the import program to continue through the weekend.

    It is now mid-day on Sunday, and the progress bar has moved to a little over 50%. About 100k transactions have been imported, but the last batch took 11 hours to run, so at this rate it seems like it might take weeks for this program to complete.
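
    For what it's worth, here is the back-of-the-envelope arithmetic behind that guess (the batch count and growth rate below are my own estimates, not anything reported by the import program):

        # Rough estimate of total import time, assuming each batch takes a
        # fixed amount longer than the one before it (arithmetic growth, as
        # I seem to be observing). All numbers are illustrative guesses.

        def total_import_hours(num_batches, first_batch_hours, growth_per_batch_hours):
            # Arithmetic series: batch i takes first_batch_hours + i * growth.
            return (num_batches * first_batch_hours
                    + growth_per_batch_hours * num_batches * (num_batches - 1) / 2)

        # Roughly two dozen batches, the first at ~11 minutes, each one
        # about an hour slower than the last:
        print(round(total_import_hours(24, 11 / 60, 1.0)), "hours")  # ~280 hours, i.e. weeks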

    So the basic question is: what should I do? Should I stop the import program and change settings to improve speed or efficiency? The documentation seems to suggest that stopping and continuing at a later time or date is supported, but I’m a little nervous about the potential for losing the work done so far.

    Another possible strategy is to start over with a truncated data set–for example, create a data set with just the last couple of years instead of a complete set.

    Another thought–if I stop the import program, will it be possible to use the already imported data to work with NV2, or must I complete the import process before starting the program in operating mode?

    Finally, it would be helpful to have some understanding of why the import process is taking so long–does this have some implication for whether it makes sense to use a dataset as large as ours? Should we be thinking more about importing less historical data–with perhaps an eye to back-loading the data after QW Page has had some more time to optimize the program?

    I do well remember the early days of NV when it seemed like an incredibly resource-hungry program. As the years passed and the hardware got bigger, cheaper and faster, NV seemed more like a speed demon with each passing season. I suppose we can tolerate a little return to the days of yore while the new system “shakes out”, but it would be great to have a better idea of what we’re in for.

    All help and suggestions gratefully accepted. :-)

    –Jack Love
    A happy NV1 customer since 1986.

    #12887
    JLove
    Participant

    Since no one’s replied (yet), I thought I’d let anyone reading this forum know that as of Monday afternoon, we are up to 70 hours of processing and roughly 68% completion…

    –Jack Love

    #12888
    JPeters
    Participant

    You are correct to start with a set of books that you know thoroughly.
    If you have a small set of books under 50 MB, that would be a good place to start.

    When converting books larger than 50 MB, you should use “transaction compression”.

    We recommend that you set the “Begin Full Detail” date 12 months back from your DOS books’ fiscal year end.

    With “transaction compression”, all your data is still transferred to NV2. Data older than the “Begin Full Detail” date will be summarized monthly. This summarized data can be found in the “Compressed Transactions Journal”.
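
    As an illustration only (the dates below are examples, not defaults), the “Begin Full Detail” date is simply twelve months before your fiscal year end:

        # Illustrative only: a "Begin Full Detail" date twelve months back
        # from the DOS books' fiscal year end. The setting name is from the
        # recommendation above; the dates are made-up examples.
        from datetime import date

        def begin_full_detail(fiscal_year_end):
            # Same month and day, one year earlier.
            return fiscal_year_end.replace(year=fiscal_year_end.year - 1)

        print(begin_full_detail(date(2004, 12, 31)))  # -> 2003-12-31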

    Our recommendation is to “decompress” one month of transactions (i.e. the last row of the journal, working backwards) each weeknight, and a block of two or three months each weekend. This way, you can use the time when the computer would normally be idle to systematically decompress the “compressed data” in NV2.
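
    As a rough guide to the pace this schedule implies (illustrative arithmetic only; adjust the weekend block to suit your own books):

        # One compressed month per weeknight plus a block of two or three
        # months each weekend works out to roughly 7-8 months per calendar
        # week. The figures below are illustrative.
        import math

        def weeks_to_decompress(compressed_months, weekend_block=2):
            months_per_week = 5 + weekend_block   # 5 weeknights + weekend block
            return math.ceil(compressed_months / months_per_week)

        print(weeks_to_decompress(108))  # e.g. 9 years compressed -> 16 weeks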

    Our new HTML manual is searchable; please check the references to “compress” and “decompress”.

    Regarding NV2 performance in general, you are correct to note that in the beginning NV1 was a resource-hungry program, but it got faster and faster as computers got faster and Q.W. Page added performance enhancements. The same will be true of NV2.

    The simple explanation for NV2’s current performance can be summarized as follows:

    1) NV2 is based on an entirely new database technology that permits real-time, multi-user access and system-wide database updating, without the need for batches, or file/record locking.

    2) All data values in NV2 are “variable length”. There is a performance price to pay for index keys that can be any length.

    3) Account ledgers in NV1 were only kept for posting accounts. Total accounts stored only a summary of activity. This was a major restriction, and it created all sorts of problems for users with odd reporting periods, changing fiscal year ends, and so on. In NV2 all transactions “ripple” to all total accounts. This means you can view the ledger detail of total accounts, but, more importantly, you can report on any period (e.g. monthly for prior years without limit, weekly, daily, etc.)

    Q.W. Page included transaction compression/decompression to address the challenge of importing years of activity into books with deep and complicated total-to structures. For example, we have seen a set of books where the average posting affected 95 total accounts. In an NV2 conversion of this set of books, each posting created approximately 1,000 ledger index entries. Without transaction compression, this set of books might have taken a week or two, or longer, to import. With compression (with the “Begin Full Detail” date set prior to the start of the current fiscal year), the books converted overnight or over a weekend.

    NOTE: You can find diagnostics on your total-to structure in the file NV1_NV2.totalsinfo.
    This file is created (in the directory the books are in) after all the total-to’s are connected and before transactions are imported. You can open NV1_NV2.totalsinfo with a word processor. If your “AverageTotaltos” is greater than 20, you have a reasonably complicated set of books, and the import could take a while without transaction compression.
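
    If you'd rather not eyeball the file, a short script along these lines can pull the figure out (a sketch only; it assumes the number appears on the same line as the keyword, so check it against your own NV1_NV2.totalsinfo):

        # Hypothetical helper: scan NV1_NV2.totalsinfo for the
        # "AverageTotaltos" figure. The exact file layout isn't documented
        # here, so this simply looks for a number following the keyword.
        import re

        def average_totaltos(path="NV1_NV2.totalsinfo"):
            with open(path, "r", errors="replace") as fh:
                for line in fh:
                    match = re.search(r"AverageTotaltos\D*([\d.]+)", line)
                    if match:
                        return float(match.group(1))
            return None

        value = average_totaltos()
        if value is not None and value > 20:
            print("AverageTotaltos is", value, "- consider transaction compression")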

    Several promising performance enhancing strategies are in the works. NV2 will only get faster.

    Q.W. Page

    #12891
    JLove
    Participant

    Thanks for the excellent and detailed reply–I hope others can benefit from it. I will certainly make use of the suggestions for subsequent conversions. At this time, with about 80% done, I think I might as well let the process finish. It’s not hogging all my CPU resources, so I can work while it chugs away in the background, and I’m using some of the time to review the documentation.

    –Jack Love

    #12892
    CBenzel
    Participant

    I have just succeeded in getting a small set of books converted and, I think, up and running. This has taken me over a week to complete, because I feel your documentation is lacking.

    I am not pleased with what I see. My computer has 1 GB of RAM and a 2.40 GHz processor, yet this accounting package has dragged it to almost a standstill. Since I have another, much bigger set of books, I am seriously considering other options, such as just staying with the DOS version.

    Your manual does not clearly state what to do. The reports need a lot of manipulating.

    What happens to the “export” items which made entry of repeating transactions possible?

    I might add that this is not my name or email address.

    Jean Motherwell

    #12893
    MSchappler
    Moderator

    Jean,

    Click on the NV2 Help Manual and search for Block Copy. You will see information regarding Block Copy and Block Paste.

    Regards,

    Martin

    #12894
    HMah
    Participant

    Jack L.

    I’ve always been bothered by the time it takes to convert to NV2. The worst problem was that with every NV2 update you had to re-import the NV1 data.

    It’s hard to find the time to actually use NV2 when it takes 3-6 days to convert the NV1 data.

    Is QW Page still recommending that the Ctrl key be held down (a weight placed on it) during the NV2 import of the NV1 data? Even if it only increases the speed by 10%, it’s still a savings.

    #12895
    DEholnikof
    Participant

    Ah – ever try doing something else in Windows while holding down the Ctrl key? Or did you forget that focus moves with the window, eh?

    HMah wrote:

    > Jack L.
    >
    > I’ve always been bothered by the time it takes to convert to
    > NV2. The worse problem was with every NV2 update you had to
    > re-import the nv1 data.
    >
    > It’s hard to find the time to actually use nv2 when it takes
    > 3-6 days to convert the nv1 data.
    >
    > Is QW Page still recommending the Ctrl key be held down (a
    > weight place on it) during the NV2 import of the NV1 data?
    > Even it increases the speed by 10% it’s still a savings.
    >

    #12899
    BHalpin
    Participant

    We never suggested holding the Ctrl key down during import – only during the export from the DOS version.

    It makes no difference in the import.

    #12900
    LKearney
    Participant

    Hi

    I am in the process of converting a large set of books. I have used transaction compression, as it would take days, perhaps weeks, to convert otherwise.

    What I have done that helps is to create a RAMDISK on my workstation and copy the data to be converted there. Running the conversion on a RAMDISK speeds things up considerably (5-10x). My only problem with this strategy thus far is that the NewViews conversion program assumes that the temporary files it creates will be located with the data being converted. To make the RAMDISK work, it must be three times the size of the converted books plus have room for the data to be converted.
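
    To put rough numbers on that rule of thumb (the sizes below are made up for illustration, not my client's actual books):

        # Rough RAMDISK sizing under the rule of thumb above: three times
        # the size of the converted (NV2) books, plus the NV1 data being
        # converted. All sizes here are illustrative.

        def ramdisk_mb_needed(converted_books_mb, source_data_mb):
            return 3 * converted_books_mb + source_data_mb

        # e.g. books expected to convert to ~400 MB from ~150 MB of NV1 data
        print(ramdisk_mb_needed(400, 150), "MB")  # -> 1350 MB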

    I have asked TechSupport if they can give users the option of specifying the location of the backup and import data files separately from the location of the database being created. That way the RAMDISK would only need to contain the database being created. As backups, etc., are only created periodically, they could be located on a physical hard drive and not on the RAMDISK.

    Perhaps TechSupport could post a note here when they find out if this is possible.

    Cheers

    Les

    #12905
    HMah
    Participant

    Finally converted the 39 MB set of books I mentioned in the Pre-release forum, which at that time took 52 hrs. This time it took 52 min for nvexport and 7 hrs for the NV2 import (no compression). Great improvement. Good work, guys.

    #12906
    BHalpin
    Participant

    Re: Backup to other path

    Sorry – but this would be a low priority. The import is a process that most users will perform only once; therefore, tricking the process out to save 10 or 20% of pure machine time would take a back seat to adding improvements to the program that will benefit them every day.

    #12907
    BHalpin
    Participant

    Re: Henry’s 7 hour import.

    Thanks for the credit on the speed improvement – however, nothing has been added or changed in the process that would have sped it up in any way. So, I’m glad you’re happy with the lower time, but from our perspective the improvement must be something you did on your side.

    #12910
    LKearney
    Participant

    Re: Backup to other path

    I am sorry to hear that NewViews will not allow users to specify the location of the intermediate backup files during conversion. I have a very large set of books that I cannot convert without this change. As a result, I cannot meet my client’s needs, as the current conversion program would take many days, perhaps longer, to run. Using the suggested conversion/training strategy is completely impossible unless I can reduce the conversion time.

    I know that most users will not have a significant “time to convert” problem. However, I am sure there are more than a few for whom this *is* a significant problem.

    Is there someone at QW Page I can speak to directly about this?

    Regards

    Les

    #12911
    BHalpin
    Participant

    Les:

    OK, we’ll add a mechanism to specify an alternate backup path in the next build. (That will probably be on May 30.)

    Until then, if you are following the recommended conversion strategy (i.e., converting the books multiple times, etc.), have you tried compressing all but the current year or two?

    A question about your post of May 12 in which you say:

    “To make the RAMDISK work, it must be three times the size of the converted books plus have room for the data to be converted.”

    If I do the math, you’re saying the space required is four times the size of the result. Hmmm, I’m thinking it should be slightly less than three:

    1 – The database that’s being built
    2 – Room for the database recovery file
    (Up to maybe 75% of the current database size)
    3 – The intermediate backup

    How do you get four?
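
    Here is the arithmetic behind that breakdown (the 75% recovery-file figure is the upper bound mentioned above, so treat this as a ceiling rather than a fixed formula):

        # Space needed on the RAMDISK, expressed as a multiple of the size
        # of the finished NV2 database. The recovery-file fraction is an
        # upper bound, not a fixed figure.

        def space_multiple(recovery_fraction=0.75):
            database = 1.0                 # the books being built
            recovery = recovery_fraction   # database recovery file
            backup = 1.0                   # the intermediate backup
            return database + recovery + backup

        print(space_multiple())  # -> 2.75, i.e. slightly less than three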

    Bob Halpin
    (Q.W.Page)
