[Bucardo-general] Problem when copying big table

Levani Gventsadze lgventsadze at oroinc.com
Sat Dec 25 05:30:37 UTC 2021


Hello,
Thank you for your response.
I will need to dive into the logs and get more than this, but I doubt it will help, because the regular method (dumping the data and then importing it into the database) works without problems.

Regards,
Levan

From: Jon Jensen <jon at endpointdev.com>
Date: Friday, 24 December 2021 at 23:29
To: bucardo-general at bucardo.org <bucardo-general at bucardo.org>, Levani Gventsadze <lgventsadze at oroinc.com>
Subject: Re: [Bucardo-general] Problem when copying big table
Levani,

Do you have logs from earlier than this point? It looks to me like the
ERROR: you cited is the continuation of a transaction error state that
began earlier, so this COPY is not your actual problem. Are there other
ERROR: lines in your log before this one?

In any case I would not expect an INSERT to work where a COPY doesn't
work. They should behave the same.
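The aborted-transaction state described here is standard PostgreSQL behavior and can be reproduced in a few lines of psql (an illustrative sketch, not taken from the original logs):

```sql
-- Once any statement inside a transaction block fails, every subsequent
-- command in that block is rejected with the same ERROR seen in the log,
-- until ROLLBACK (or COMMIT, which also rolls back) ends the block.
BEGIN;
SELECT 1/0;   -- ERROR:  division by zero
SELECT 1;     -- ERROR:  current transaction is aborted,
              --         commands ignored until end of transaction block
ROLLBACK;     -- clears the aborted state; the session is usable again
```

This is why the DEALLOCATE in the log fails with "current transaction is aborted": some earlier statement in the same transaction had already errored out.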

Jon


On Fri, 24 Dec 2021, Levani Gventsadze wrote:

> Hello all,
> We are having an issue with Bucardo v5.6 when copying a big database table.
> The copy gets aborted without any useful information in the log (we tried increasing the log level).
> Here’s what we get every time we try to copy that big table:
>
> < 2021-12-23 10:17:28.860 UTC > CONTEXT:  COPY oro_product, line 521577
> < 2021-12-23 10:17:28.860 UTC > STATEMENT:  /* Bucardo 5.6.0 */COPY public.oro_product("id","organization_id","business_unit_owner_id","primary_unit_precision_id","brand_id","inventory_status_id","attribute_family_id","sku","sku_uppercase","name","name_uppercase","created_at","updated_at","variant_fields","status","type","is_featured","is_new_arrival","pagetemplate_id","category_id","taxcode_id","manageinventory_id","highlightlowinventory_id","inventorythreshold_id","lowinventorythreshold_id","minimumquantitytoorder_id","maximumquantitytoorder_id","decrementquantity_id","backorder_id","isupcoming_id","availability_date","serialized_data","book_type_id","bn_average_rating","bn_audience_age_from","bn_audience_age_to","bn_dimension_depth","bn_dimension_height","bn_dimension_weight","bn_dimension_width","bn_retail_price","bn_bisac_format","bn_dimension_unit","bn_dimension_weight_unit","bn_tax_id","bn_company_name","bn_display_edition_description","bn_url_keywords","bn_publication_date"
 ,"bn_author_bio","bn_edition_number","bn_image_version","bn_number_of_pages","bn_work_id","bn_discountable_flag","bn_large_print_ind","bn_shippable_flag","bn_audience_id","bn_language_desc_id","bn_display_format_id","bn_parent_format_id","bn_lexile","bn_lexile_value","bn_series_id","bn_series_number","bn_series_title","bn_contributors") FROM STDIN
> < 2021-12-23 10:17:28.861 UTC > ERROR:  current transaction is aborted, commands ignored until end of transaction block
> < 2021-12-23 10:17:28.861 UTC > STATEMENT:  DEALLOCATE dbdpg_p6534_2
>
> We see this command is using the Postgres COPY protocol; is there a way to use INSERT instead? (I am not sure if that will help, though.)
>
> Does anyone have any ideas to help us identify the problem and try to solve it?
>
>
> Thank you in advance.
>
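One way to surface the earlier failing statement is to raise the server's statement logging on the destination database. A minimal sketch, assuming superuser access on that Postgres instance (these are standard PostgreSQL settings, nothing Bucardo-specific):

```sql
-- Temporarily log every statement so the first ERROR preceding the COPY
-- becomes visible in the server log. This is verbose; revert it afterwards.
ALTER SYSTEM SET log_statement = 'all';
ALTER SYSTEM SET log_min_error_statement = 'error';  -- the default; logs the statement that errored
SELECT pg_reload_conf();

-- ...reproduce the sync, inspect the Postgres log, then revert:
ALTER SYSTEM RESET log_statement;
SELECT pg_reload_conf();
```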

--
Jon Jensen
End Point Corporation
https://www.endpointdev.com/

