Some forms' "make .csv" process times out with an error message

dmajwool

Member
On DaDaBIK 8.3 and then 9.0 I get the same behaviour.

On some of my pages derived from Views, when there is a larger number of records (like all 600 of them), the .csv process times out. If I filter the same page on something to reduce the number of displayed records (to around 200), the .csv process takes only 3 or 4 seconds. There are 10 fields displayed on the page.

Fatal error: Maximum execution time of 60 seconds exceeded in /var/sites/s/mysite.co.uk/public_html/secretarial/include/business_logic.php on line 0

How can I debug this?

Many thanks, David.


******************
You are using DaDaBIK version 9.0-Monterosso pro, installed on 07-18-2018 (installation code: 0), the latest version of DaDaBIK is 9.0-Monterosso released on 07-05-2018

You are running the last release of DaDaBIK

PHP Version: 5.6.22

mysql version: 5.6.21

Web server: Apache

Client: Mozilla/5.0 (Windows NT 6.1; Win64; x64; rv:61.0) Gecko/20100101 Firefox/61.0
 

dmajwool

Member
I've been doing some more testing and it seems like it might genuinely be a timeout, but I don't know how I can optimise things.

My DaDaBIK form has several lookup fields, and if I include any one of them in the .csv permissions, the .csv process takes far longer to run and will probably time out.

If I include a looked-up field in the .csv permissions, but then remove the lookup in the Forms Configurator (so that just the code is shown on the form rather than the looked-up meaning of the code), the .csv export happens really quickly.

What I don't quite understand is that the form loads quickly in the browser with all the lookups doing their thing; it's only the .csv export that gets boggy.

Any thoughts?

Many thanks, David.
 

eugenio

Administrator
Staff member
Hello,
first of all, you can increase the max execution time (if your hosting provider allows it) by setting $csv_creation_time_limt in config.php.

When you load a form, you are loading one record; when you create a CSV file, you are loading hundreds of records. What is happening is that your CSV file really does need some time to be created.

Best,
 

dmajwool

Member
Hi Eugenio,

Thanks for the reply.

I don't have the option of increasing the max execution time in php.ini. It's not allowed by my hosting provider.

I'm not sure that there isn't something else going on here. This DaDaBIK 9 site is a rework of a very old DaDaBIK installation, maybe v4.0.

This is from the top of the old version's index.php:

[pre]
DaDaBIK (DaDaBIK is a DataBase Interfaces Kreator) http://www.dadabik.org/
Copyright (C) 2001-2010 Eugenio Tacchini

...
<?php
include ("./include/config.php");
/* 4.0 */
[/pre]


The old installation exported the same pages to .csv, including the lookups, without any timeout issues, on the same hosting provider.

Some further info.

When I select 100 records per page, my browser takes about 3 or 4 seconds to load the next 100 records. So if the total set of records that would appear in the .csv export is 650 (about 6.5 pages at 3-4 seconds each), we might expect the server to take roughly 30 seconds to process the query.

But the server is timing out after 60 seconds.

Running DaDaBIK 9, I've just timed some exports.

The DaDaBIK page takes records from a single MySQL table which has 110 columns and 618 records (it is the users table created by phpBB).
The DaDaBIK page has 10 columns, three of which are looked up from one other table called DropDownListTextValues.

If I do a quick search in DaDaBIK (e.g. user_id < 50) I can control the size of the .csv export. All the following quick-search results take about 3 seconds to load in DaDaBIK.

Just now I timed some .csv exports (the time taken until the save-file dialog pops up):

Records - 34. CSV completes in 33 secs
Records - 54. CSV completes in 34 secs
Records - 84. CSV completes in 44 secs
Records - 109. CSV times out after 60 seconds with error message as at start of this thread.

I'm surprised that the export-to-csv process should be expected to take as long as this. Extrapolating wildly, it looks like my export is doing about 3 records per second, so a 618-record export might take around 4 minutes.

Does that sound right to you?

Many thanks, David.
 

dmajwool

Member
Further testing:

I created a new View in the database with just the 10 fields I require for my DaDaBIK page, and created a new DaDaBIK page using the View (rather than having DaDaBIK select 10 columns from the source table with 110 columns). This way, I reckoned, there would be less redundant data flowing from the server.
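Roughly speaking, the new View looked something like this (a sketch only: the view name and the exact column list here are illustrative rather than my exact definition; note that the lookup columns are still raw codes at this stage):

[pre]
CREATE VIEW vw_member_export AS
SELECT u.user_id,
       u.username,
       u.user_email,
       d.pf_title,
       d.pf_firstname,
       d.pf_lastname,
       d.pf_city,
       d.pf_postcode,
       d.pf_telephone,
       d.pf_yearjoined      -- still the coded value, no lookup yet
FROM phpbb_users u
JOIN phpbb_profile_fields_data d ON d.user_id = u.user_id;
[/pre]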

Before I added the lookup instructions in the new View's Forms Configurator, the full set of 618 records was exported to .csv in a couple of seconds.

When I added the 3 lookup instructions, the .csv export timed out after 60 seconds.

So it seems to me, comparing today's behaviour with my previous application, that the .csv query in DaDaBIK v4 handled lookups better than DaDaBIK v9 does.

Is this possible?

Thanks again, David.
 

eugenio

Administrator
Staff member
Hello,
I haven't tested it, but the execution time might be lower. However, we are talking about two very different pieces of software, the old one (released 8 years ago) having bugs and security issues that have been fixed in later releases. Furthermore, v4 didn't provide a granular permissions model, which surely has an impact on performance.

It's difficult right now to say which element has had the biggest impact on CSV creation performance because, again, we are talking about two very different releases.

Best,
 

dmajwool

Member
Hi Eugenio,

The comparison with v4 was very much by the by.

What is important at the moment is that in v9 I am unable to see a way to export to .csv a 600-record set of 10 columns, three of which are lookups.

Can you advise how I can achieve this within my 60-second execution time limit, please?

Thanks, David.
 

eugenio

Administrator
Staff member
Hello,
I am sorry but there isn't any "trick" to improve the performance of the CSV export.

As I said, v4 and v9 are two quite different pieces of software in terms of security and feature set, so the execution time of an export to CSV might be different.

Best,
 

dmajwool

Member
Hi Eugenio,

I think it's a bit of a shame that you are not yet engaged with this problem.

I believe it's not unreasonable to expect DaDaBIK to be able to process a .csv of 200 records with 3 lookups without timing out after 60 seconds.

Your response, that the solution is for me to get a faster ISP service or be prepared to wait several minutes for the report to be generated, is IMO not really smart thinking. Wouldn't it be better for DaDaBIK to find a way to create these reports more efficiently?

...and I have. I can now produce my much larger reports of 600 records with 13 lookups in about 4 seconds. I think this order-of-magnitude difference in performance suggests there may be scope for you to redesign your .csv creation routine and make DaDaBIK a better product in the process.

Let me explain what I did (it's working fine and my users are running it without problems).

I don't know how DaDaBIK fetches its data during the present .csv routine, and I apologise if my ignorance offends, but to test whether the problem was simply my ISP being slow, I designed a View in the database that already includes the looked-up values as plain text rather than as coded values.

Because this View returns the data set to DaDaBIK with the text substitutions already made, the DaDaBIK .csv routine doesn't need to do the thing that cripples it, and everything runs quickly. As I say, the .csv report now completes in a few seconds.

Apart from the effort and complexity of creating and managing additional, essentially redundant Views like this, there are several operational drawbacks to this approach. For example, the records in this View can never be editable, because we are seeing the text labels rather than the underlying coded values. So it becomes necessary for my app to have one set of menus/tables/views/forms for data entry and another set of menus/views/forms for .csv export, all of which need their designs and permissions to be maintained.

However, I think I have proved the point that DaDaBIK could consider developing a different paradigm that would avoid the problem in the first place and thus make my workaround unnecessary. How would it be for you if the .csv routine worked as follows?

CSVExportButton onClick:
1. Create a relevant SQL query based on the currently displayed results grid and, crucially, embed all the lookups into this SQL (example below; see also the sketch just after this list).
2. Create a temporary View in the database using that SELECT statement.
3. Run the .csv export against the newly created temp View.
4. Drop the temp View from the database.
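To make steps 2-4 concrete, here is a hypothetical sketch of that lifecycle (the view name csv_export_tmp is illustrative and I've trimmed the column list to one lookup; in practice step 1 would generate the full lookup-embedding SELECT shown at the end of this post):

[pre]
-- step 2: wrap the generated SELECT, with the lookups already joined in, in a temporary view
CREATE OR REPLACE VIEW csv_export_tmp AS
SELECT u.user_id,
       d.pf_lastname,
       l.fld_userType AS AccessLevel   -- lookup already resolved to its text value
FROM phpbb_users u
JOIN phpbb_profile_fields_data d ON d.user_id = u.user_id
JOIN user_DropDownListTextValues l ON u.user_type = l.fld_userCode;

-- step 3: the .csv routine reads plain rows from the view, no per-field lookups needed
SELECT * FROM csv_export_tmp;

-- step 4: clean up
DROP VIEW csv_export_tmp;
[/pre]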

Hope this helps someone.

Cheers, David.

[pre]
SELECT
`phpbb_users`.`user_id` AS `id`,
`phpbb_profile_fields_data`.`pf_title` AS `Title`,
`phpbb_profile_fields_data`.`pf_lastname` AS `Lastname`,
`phpbb_profile_fields_data`.`pf_firstname` AS `Firstname`,
`phpbb_profile_fields_data`.`pf_emailsalutation` AS `EmailSalutation`,
`phpbb_profile_fields_data`.`pf_postalsalutation` AS `PostalEnvelope`,
`phpbb_profile_fields_data`.`pf_address_one` AS `Address1`,
`phpbb_profile_fields_data`.`pf_address_two` AS `Address2`,
`phpbb_profile_fields_data`.`pf_address_three` AS `Address3`,
`phpbb_profile_fields_data`.`pf_city` AS `City`,
`phpbb_profile_fields_data`.`pf_county` AS `County`,
`phpbb_profile_fields_data`.`pf_postcode` AS `Postcode`,
`phpbb_profile_fields_data`.`pf_telephone` AS `Telephone`,
`user_DropDownListTextValues_1`.`fld_Boolean` AS `EmailWorks`,
`user_DropDownListTextValues_2`.`fld_userStatus` AS `UserStatus`,
`phpbb_users`.`user_email` AS `Email`,
`user_DropDownListTextValues_3`.`fld_userType` AS `AccessLevel`,
`phpbb_users`.`username` AS `Username`,
`phpbb_users`.`user_interests` AS `Interests`,
`user_DropDownListTextValues_4`.`fld_Boolean` AS `CommsByPost`,
`user_DropDownListTextValues_5`.`fld_Boolean` AS `NewsletterByPost`,
`user_DropDownListTextValues_6`.`fld_years` AS `YearJoined`,
`user_DropDownListTextValues_7`.`fld_years` AS `PaidUntil`,
`user_DropDownListTextValues_8`.`fld_paymentMethod` AS `PaymentMethod`,
`user_DropDownListTextValues_9`.`fld_Boolean` AS `ShowEmail`,
`user_DropDownListTextValues_10`.`fld_Boolean` AS `ShowPhone`,
`user_DropDownListTextValues_11`.`fld_Boolean` AS `ShowAddress`,
`phpbb_profile_fields_data`.`pf_partnerto` AS `PartnerTo`,
`user_DropDownListTextValues_12`.`fld_Boolean` AS `GDPRconsent`,
`user_DropDownListTextValues_13`.`fld_Boolean` AS `ShowInterests`
FROM ((((((((((((((`phpbb_users`
JOIN `phpbb_profile_fields_data`
ON ((`phpbb_users`.`user_id` = `phpbb_profile_fields_data`.`user_id`)))
JOIN `user_DropDownListTextValues` `user_DropDownListTextValues_1`
ON ((`phpbb_profile_fields_data`.`pf_emailactive` = `user_DropDownListTextValues_1`.`fld_PrimaryKey`)))
JOIN `user_DropDownListTextValues` `user_DropDownListTextValues_2`
ON ((`phpbb_users`.`fld_currentMember` = `user_DropDownListTextValues_2`.`fld_userCode`)))
JOIN `user_DropDownListTextValues` `user_DropDownListTextValues_3`
ON ((`phpbb_users`.`user_type` = `user_DropDownListTextValues_3`.`fld_userCode`)))
JOIN `user_DropDownListTextValues` `user_DropDownListTextValues_4`
ON ((`phpbb_profile_fields_data`.`pf_commsbypost` = `user_DropDownListTextValues_4`.`fld_PrimaryKey`)))
JOIN `user_DropDownListTextValues` `user_DropDownListTextValues_5`
ON ((`phpbb_profile_fields_data`.`pf_newsletterbypost` = `user_DropDownListTextValues_5`.`fld_PrimaryKey`)))
JOIN `user_DropDownListTextValues` `user_DropDownListTextValues_6`
ON ((`phpbb_profile_fields_data`.`pf_yearjoined` = `user_DropDownListTextValues_6`.`fld_PrimaryKey`)))
JOIN `user_DropDownListTextValues` `user_DropDownListTextValues_7`
ON ((`phpbb_profile_fields_data`.`pf_paiduntil` = `user_DropDownListTextValues_7`.`fld_PrimaryKey`)))
JOIN `user_DropDownListTextValues` `user_DropDownListTextValues_8`
ON ((`phpbb_profile_fields_data`.`pf_paymentmethod` = `user_DropDownListTextValues_8`.`fld_PrimaryKey`)))
JOIN `user_DropDownListTextValues` `user_DropDownListTextValues_9`
ON ((`phpbb_profile_fields_data`.`pf_showemail` = `user_DropDownListTextValues_9`.`fld_PrimaryKey`)))
JOIN `user_DropDownListTextValues` `user_DropDownListTextValues_10`
ON ((`phpbb_profile_fields_data`.`pf_showphone` = `user_DropDownListTextValues_10`.`fld_PrimaryKey`)))
JOIN `user_DropDownListTextValues` `user_DropDownListTextValues_11`
ON ((`phpbb_profile_fields_data`.`pf_showaddress` = `user_DropDownListTextValues_11`.`fld_PrimaryKey`)))
JOIN `user_DropDownListTextValues` `user_DropDownListTextValues_12`
ON ((`phpbb_profile_fields_data`.`pf_gdpr_consent` = `user_DropDownListTextValues_12`.`fld_PrimaryKey`)))
JOIN `user_DropDownListTextValues` `user_DropDownListTextValues_13`
ON ((`phpbb_profile_fields_data`.`pf_show_interests` = `user_DropDownListTextValues_13`.`fld_PrimaryKey`)))
ORDER BY `phpbb_profile_fields_data`.`pf_lastname`
[/pre]
 

eugenio

Administrator
Staff member
Hello,
first of all: I don't like the tone you are using. Personally, I would never say something like "your response is not really smart thinking" to someone I've barely exchanged a few messages with on a web forum. I think it's the same for the vast majority of the people who write here, and this is not the way we normally talk to each other in this forum, so please moderate your tone.

About your issue: the amount of time needed to produce a CSV depends on many factors. I don't know the details of your specific case, but I guess the performance of your machine is at least one element that might have an impact on what you are experiencing.

Try the online Demo 3 (ERP application): you have a similar situation there, about 300 customers, two lookups (you can add another one if you want), and the export takes less than 10 seconds. The machine that runs the demos is absolutely not a fancy one. I've personally tried the same demo on an old (2009) laptop: 6 seconds.

If you use a view instead of a table, you can increase performance. The performance gain you might get is not, however, related to the query execution time: in MySQL, if you use regular views, the execution time is quite similar to the execution time of the underlying query (try it yourself using phpMyAdmin). Explaining in detail what is actually happening when you use a view would take quite a lot of time.
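As a rough illustration (the view name here is just a placeholder, and the table names are taken from the query posted above), you can time a view against its defining query in phpMyAdmin and see that the two are very close:

[pre]
CREATE OR REPLACE VIEW v_users_profiles AS
SELECT u.user_id, d.pf_lastname
FROM phpbb_users u
JOIN phpbb_profile_fields_data d ON d.user_id = u.user_id;

SELECT SQL_NO_CACHE COUNT(*) FROM v_users_profiles;   -- through the view

SELECT SQL_NO_CACHE COUNT(*)                           -- the defining query directly
FROM phpbb_users u
JOIN phpbb_profile_fields_data d ON d.user_id = u.user_id;
[/pre]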

Can the performance of the CSV creation be improved? Probably yes, as is often the case with the performance of any component of any software system. Can it be improved easily? Probably not as easily as you think.

As you can imagine, the work dedicated to DaDaBIK is priority-driven: over the years, very often when lots of users asked for a new feature or an improvement, that feature/improvement was implemented.

I will consider working on the performance of CSV production, but at the moment decreasing the CSV production time is not a TOP priority, and since the release of DaDaBIK 7.3.2 (when CSV production was improved in terms of performance), you are probably the first user who has asked for it.

Best,
 

dmajwool

Member
Hello Eugenio,

Firstly, let me apologise for the offence I have caused you. I meant no disrespect; I was clumsy with my words and your feelings. I'm sorry.

Apropos my "that is not smart thinking" comment: it rattled me that your reply to an issue (where the application practically stops working when a certain option in the front end is chosen) was that I have a poor ISP and need to change providers. I do believe there is something odd going on in my .csv reports which cannot be explained away like this. To my way of thinking, making efficient use of available resources comes before masking possibly overlooked inefficiencies by burning additional processor cycles, which is why I published my workaround.

I haven't yet understood why doing the lookups by way of a database View and then filtering the resulting dataset in the DaDaBIK front end allows any of my 600-record datasets to be exported to .csv in 4 seconds, while doing the same lookups directly on the table in the DaDaBIK Forms Configurator causes the .csv export to time out after 60 seconds once the result set exceeds around 200 records. This is a big difference. Is this something that someone on these forums can explain to me, please?

I have had a look at the ERP online Demo 3 and I am not sure that I can do comparative testing between it and my application, because a) I don't see an "Export to .csv" button, and b) the potentially similar "Export to .pdf" button, which I do see, is limited in the demo to the first 10 records, which is a smaller result set than the one I have problems with in my application.

Many thanks for your help, David.
 

eugenio

Administrator
Staff member
Hello,
about the online demo: the "export to CSV" button is available; are you sure you don't see it? If I log in as "root" and choose "customers", I see it to the left of the PDF button.

Best,
 

dmajwool

Member
Thank you Eugenio.

I can now see the export-to-.csv buttons on the Products, Customers and Brands pages of Demo 3, and I agree that it works smoothly. I had been looking at the Reviews page, which does not show the .csv export button, and it seems that I cannot enable it in the demo. What version of DaDaBIK is running that demo, please, and which version of PHP?

The pages in Demo 3 with .csv buttons enabled have a modest number of records. Would it be possible to enable the .csv button on the larger pages so that I can see what happens with larger datasets, please?

Also, can you tell me how I can inspect the SQL query that is generated when I press the .csv export button?

Thank you, David.
 

eugenio

Administrator
Staff member
Hello,
I pointed out the customers table because it is a situation similar to the one you had problems with. You said: "process a .csv of 200 records with 3 lookups without timing out after 60 seconds"; we have about 300 records here, two lookups, and you can add a third one if you want.

For the reviews table, the export to CSV has been disabled because it contains ONE MILLION records.

The demos always use the latest DaDaBIK version available. PHP is still 5.6 but it will be moved soon to 7.x.

Best,
 