What database is this?
If it is postgresql, try analyzing the jobs and jobqueue tables.
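For example, from psql you could run something like this (a minimal sketch, assuming the default ManifoldCF table names):

  -- refresh planner statistics for the two busiest ManifoldCF tables
  ANALYZE jobs;
  ANALYZE jobqueue;

Since your postgresql.conf has autovacuum = off, those statistics are never refreshed automatically, so the planner can pick poor plans once the tables grow.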
Karl
On Thu, Jan 21, 2021 at 3:35 AM Cihad Guzel <cguzelg@gmail.com> wrote:
> Hi,
>
> I have some performance problems. I have 28 file crawler jobs. The job
> status page opens very slowly, and the jobs have slowed down as the amount
> of scanned data has increased. I am sharing the logs as screenshots because
> I cannot copy them. You can see the document size as follows:
>
> [image: Screen Shot 2021-01-21 at 11.18.46.png]
>
> You can see the manifoldcf logs as follows:
>
> [image: Screen Shot 2021-01-21 at 11.16.43.png]
>
> You can see the postgresql logs as follows:
>
> [image: Screen Shot 2021-01-21 at 11.12.08.png]
> I tuned postgresql. You can see my postgresql.conf as follows:
>
> listen_addresses = '*'
> max_connections = 200
> shared_buffers = 1GB
> effective_cache_size = 3GB
> maintenance_work_mem = 256MB
> autovacuum = off
> datestyle = 'ISO,European'
> standard_conforming_strings = on
> work_mem = 5242kB
> checkpoint_timeout = 1h # range 30s-1d
> checkpoint_segments = 64
> checkpoint_completion_target = 0.9
> wal_buffers = 16MB
> default_statistics_target = 100
> random_page_cost = 1.1
> effective_io_concurrency = 300
>
> How can we make an improvement?
>
> Cihad Güzel
>