List loading extremely slow with join applied

OK, well I went on to the dev server, set that my_concelhos 'id' to be the PK, and that index.php/pt/profissionais-saude/candidaturas page now loads in about the same time it does on my local server.

-- hugh
 
Hugh,

Correct... I've tested and it's ok!

Why can I not achieve the same on the main site? Can you please just check if I missed something?
 
Looks like it should be OK, but isn't. And even with the memory_limit at 1024M, it still runs out if I try and enable J! debug, so I can't get a profile to see where the time is spent.

Which it doesn't do on either your dev site or mine. That page only uses about 35MB of memory. But on your live site, with debug enabled, it's hitting a gigabyte.

So obviously something is spinning out of control, but I have no clue what.

Have you made any changes to the site since I took the Akeeba backup I'm using?

-- hugh
 
Probably some minor changes.

I did another Akeeba backup. One thing my backup doesn't include for the candidaturas list is the uploaded files, because the backup would be huge.

If you could test this backup on your side, you would probably find what is messing up the loading times.
 
Did you do anything that would change the file system, or can I just do a database backup? It makes quite a difference to the time. If I have to do another full site transfer, I have to blow away my test site, recreate it from scratch with Kickstart, and build another PhpStorm project on it - so that's another 30 minutes of billable time. If it's just the database, I can just run the SQL file in Navicat, then reset the connection password in Fabrik, and it's done - 5 minutes.

-- hugh
 
OK, I loaded up the new database, and that page still loads in a couple of seconds, no obvious bottlenecks, and it loads fine even with J! debug enabled.

So what possible differences might there be between your live server, and my test server / your dev server?

-- hugh
 
Hugh,

I truly don't know. But I can give you access to my Plesk panel so you can analyse it. What do you think?
 
I don't think Plesk is going to help.

BTW, I just noticed an error on this page:

tratamentos-fisioterapia/entregar-doc

filter query error: tipo_doc Subquery returns more than 1 row

... which is due to this part of the query ...

Code:
if (951 in (2133,922,502,499),
    '%',
    concat('%', (SELECT group_id
                 FROM flar_user_usergroup_map
                 WHERE user_id = 951 and group_id <> '20'), '%'))

... which comes from the WHERE clause on the join element, ID 1327.
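For what it's worth, if the intent of that subquery is to match any of the user's groups, one common way to make it single-valued is to collapse the rows with GROUP_CONCAT. This is only a sketch, reusing the table and values from the error above - the right fix depends on what that WHERE clause is actually meant to filter:

```sql
-- Sketch: collapse the multiple group_id rows into one comma-separated
-- value, so the subquery can never return more than one row.
SELECT GROUP_CONCAT(group_id)
FROM flar_user_usergroup_map
WHERE user_id = 951 AND group_id <> '20'
```

Whether a LIKE against a '%20,23%'-style concatenated value still matches what the filter intends is a separate question - it may be cleaner to rewrite the condition as an IN (SELECT ...) instead of a pattern match.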

-- hugh
 
OK, the related data on Pacientes is fixed - had to trash that copy of the Tratementos list. (Forgive my spelling if I got those wrong)

But still no idea on why that other list is still slow on your live site.

-- hugh
 
My development server is the same server of my live site only in a different folder. So it's strange the difference in the loading times.

Once again if there is something I could do to help... Please advise.
 
Well, one thing I notice is that it runs fast on the back end. The slowness is only on the front end.

I'm about as sure as I can be that this isn't a Fabrik issue, but I'm currently at a loss as to what it could be.

As I can't replicate the problem on my site or your test site, about the only thing left I can do is to start putting in debug code on your live site to try and track it down.

-- hugh
 
Hugh,

I'll do a new backup.

The front end is the way to go... Now is the perfect time, since we are closed.
 
OK. Do you want to close the site? I'm going to have to put debug breaks in places that will kill all your lists.

BTW, according to FileZilla, the certificate on your ftp server has expired.

-- hugh
 
OK... It's closed; I suppose that was you!

About the certificate: what are the concerns about it being expired? When you refer to a certificate, are you referring to encryption like HTTPS, but on the server?
 
OK, it's your php_events code in onMakeFilters. The reason it doesn't slow down on my server or your test server, I assume, is because those arquivo/candidaturas_cv/* files don't exist. And I'm guessing that there are a LOT of files in that folder ... and your code is executing a database query for every single one of them.

It's hard to figure out what that code is doing, but as far as I can tell, it's trying to delete any files which don't exist in the fb_candidaturas table.

If so ... it's doing it in a VERY inefficient way.

First thing ... this kind of "garbage collection" shouldn't be done during a page load, where it can impact the user experience. Stick it in a cron task.

Second ... don't fire off a query every time round that loop. Do one query, before the loop, that selects the cv from all rows, something like ...

$db = JFactory::getDbo();
$query = $db->getQuery(true);
$query->select('cv')->from('fb_candidaturas');
$db->setQuery($query);
$cvs = $db->loadColumn();

... then in the loop just check to see if it exists ...

if (in_array($cv, $cvs)) {
...
}

Also, best to use J!'s folder and file API, like JFolder::files() and JFile::exists().
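Putting those pieces together, a minimal sketch of what the cron-task version might look like - assuming the fb_candidaturas table and arquivo/candidaturas_cv folder mentioned above, and that the cv column stores the bare filename (if it stores a path or JSON, the comparison would need adjusting):

```php
// Sketch only - assumes Joomla's framework is loaded, and the
// table/folder names above. Not a drop-in implementation.
jimport('joomla.filesystem.folder');
jimport('joomla.filesystem.file');

$db = JFactory::getDbo();

// One query, before the loop: grab every cv value still referenced.
$query = $db->getQuery(true)
    ->select($db->quoteName('cv'))
    ->from($db->quoteName('fb_candidaturas'));
$db->setQuery($query);
$cvs = $db->loadColumn();

// List the uploaded files once, then test each against the in-memory array.
$folder = JPATH_SITE . '/arquivo/candidaturas_cv';

foreach (JFolder::files($folder) as $file)
{
    if (!in_array($file, $cvs))
    {
        // Orphaned file: no matching row, so it can be garbage collected.
        JFile::delete($folder . '/' . $file);
    }
}
```

Run from a cron task, this does a single query and a single directory listing no matter how many files are in that folder, instead of one query per file on every page load.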

-- hugh
 
Well... problem found. Thank you for your persistence.

There is code to delete files that no longer have a matching record, and to rename existing ones. BTW, I think the upload element should have the option to rename files using specific text with placeholders, like the record ID or others.

About deleting files: I'm using that code because the element is not deleting the file(s) of a specific record that we've just deleted.
 
About deleting files: I'm using that code because the element is not deleting the file(s) of a specific record that we've just deleted.

You mean when you delete a row from the list, or when you hit the "Delete" button on the upload element in the form?

-- hugh
 