How can I handle a huge amount of logging generated from frequent table rebuilds?

We have several tables that we "refresh" frequently by rebuilding them in a staging table, then performing a metadata switch into the production table using the ALTER TABLE &lt;source&gt; SWITCH TO &lt;target&gt; statement.
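A minimal sketch of this rebuild-and-switch pattern, using hypothetical table names (dbo.Sales, dbo.Sales_Staging); for the SWITCH to succeed, both tables must have identical schemas on the same filegroup, and the target must be empty:

```sql
-- Rebuild the staging table (the expensive, heavily logged part).
TRUNCATE TABLE dbo.Sales_Staging;

INSERT INTO dbo.Sales_Staging (SaleID, Amount, SoldAt)
SELECT SaleID, Amount, SoldAt
FROM dbo.Sales_Source;   -- illustrative source query

-- Swap staging into production as a metadata-only operation.
BEGIN TRANSACTION;
    TRUNCATE TABLE dbo.Sales;                           -- target must be empty
    ALTER TABLE dbo.Sales_Staging SWITCH TO dbo.Sales;  -- metadata-only switch
COMMIT TRANSACTION;
```

The SWITCH itself is nearly instantaneous; it is the INSERT that rebuilds the staging table which generates the bulk of the log.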

These operations generate a huge amount of transaction log, and we're not sure how to handle it. For now we have moved these tables and operations to a separate database using the SIMPLE recovery model, but I would love to hear alternative solutions to this problem.

How do you suggest we handle the massive log generation as a result of frequent table rebuilds?

Reply

One option (if the downside is acceptable to you) would be to change the database to the BULK_LOGGED recovery model and bcp into the target table from a view reading the source tables. The actual logging should be minimal, but the log backup will still be huge, as it will include the changed extents. A downside of BULK_LOGGED is that it compromises the ability to do a point-in-time restore to a moment during the bulk-logged operation. Make sure you take a log backup immediately after the bulk import.
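The workflow above can be sketched as follows; the database name, view, and backup path are illustrative, not from the original post:

```sql
-- Temporarily switch to BULK_LOGGED so bulk operations are minimally logged.
ALTER DATABASE StagingDb SET RECOVERY BULK_LOGGED;

-- Run the bulk import here, e.g. from the command line:
--   bcp "SELECT * FROM StagingDb.dbo.SalesView" queryout sales.dat -n -S server -T
--   bcp StagingDb.dbo.Sales_Staging in sales.dat -n -S server -T
-- (SELECT ... INTO and INSERT ... WITH (TABLOCK) can also qualify
--  for minimal logging under BULK_LOGGED.)

-- Return to FULL recovery once the bulk work is done.
ALTER DATABASE StagingDb SET RECOVERY FULL;

-- Back up the log immediately: point-in-time restore is unavailable
-- only within log backups that contain bulk-logged operations, so a
-- prompt backup closes that window.
BACKUP LOG StagingDb TO DISK = N'C:\Backups\StagingDb_log.trn';
```

Note that the log backup after the bulk operation will still be large, because it must include the data extents the minimally logged operation touched; what shrinks is the live transaction log generation, not the log backup size.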

Category: sql server Time: 2016-07-29 Views: 0

