
Author Topic: Dumping a production database into a staging environment - recommendations  (Read 25848 times)

didier.pironet

  • EA User
  • **
  • Posts: 33
  • Karma: +0/-0
Hi guys,
We have several Sparx environments PRD, STG and DEV all attached to Azure RDS (MSSQL)
STG is practically a copy of production, but with different security settings.
Today we use the Sparx Export/Import functionality, but the size of PRD has increased significantly over the years: a refresh of STG now takes more than 6 hours, must be done outside business hours, involves many manual steps, and is very sensitive to glitches.
We could go with Sparx-agnostic alternatives such as database replication, snapshots, cloning techniques, or the classic database backup/restore.
However, those alternatives come with limitations, added complexity, and cost impacts, and may not actually be supported by Sparx.

Questions are:
- First, are there Sparx built-in alternatives, i.e. other than the classic Export/Import function? These could be external tools, as long as they are provided and supported by Sparx.
- If not, what alternatives out there are supported/validated/recommended by Sparx?

Thanks for your feedback!

Modesto Vega

  • EA Practitioner
  • ***
  • Posts: 1173
  • Karma: +30/-8
Re: Dumping a production database into a staging environment - recommendations
« Reply #1 on: September 25, 2025, 07:37:27 pm »
Azure RDS could be adding complexity to what you are trying to achieve.

We have a similar setup using MSSQL, but hosted on Azure VMs. Our equivalent of your STG environment gets refreshed from PROD every 2 weeks by a SQL Agent job. The job does not take long to execute; it restores the latest PROD backup to the STG environment, overwriting the current version.

Azure SQL Database does not support SQL Agent but Azure SQL Managed Instance does.
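A minimal sketch of what such a refresh step could look like in T-SQL, assuming hypothetical database names (PRD_EA, STG_EA), backup path, and logical file names — all placeholders to adapt to your environment:

```sql
-- Hypothetical SQL Agent job step: restore the latest PROD backup over staging.
-- Database names, the backup path, and logical file names are placeholders.
USE master;

-- Kick out any connected EA clients before overwriting staging.
ALTER DATABASE STG_EA SET SINGLE_USER WITH ROLLBACK IMMEDIATE;

RESTORE DATABASE STG_EA
FROM DISK = N'\\backupshare\PRD_EA_latest.bak'
WITH REPLACE,
     MOVE N'PRD_EA_Data' TO N'D:\Data\STG_EA.mdf',
     MOVE N'PRD_EA_Log'  TO N'E:\Log\STG_EA.ldf';

ALTER DATABASE STG_EA SET MULTI_USER;
```

Note that plain `RESTORE ... FROM DISK` works on SQL Server on VMs and on Azure SQL Managed Instance (via URL-based backups), but not on Azure SQL Database.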

jvduuren

  • EA Novice
  • *
  • Posts: 2
  • Karma: +0/-0
Re: Dumping a production database into a staging environment - recommendations
« Reply #2 on: September 25, 2025, 09:26:24 pm »
We do that via the option "Full Project Transfer via Native XML -> Export Complete Project" under "Settings -> Model -> Transfer".
This exports all tables from the database as-is to local XML files.
Then you can import them via "Import Complete Project".
However, you need to change your security settings manually afterwards, as it also copies the security tables as-is.
This is much faster than the regular Package Export function.

PS: another, much faster option is to do an SQL dump to a local file and then import it in the other stage(s). This is even faster than the process above.

Geert Bellekens

  • EA Guru
  • *****
  • Posts: 13471
  • Karma: +571/-33
  • Make EA work for YOU!
    • Enterprise Architect Consultant and Value Added Reseller
Re: Dumping a production database into a staging environment - recommendations
« Reply #3 on: September 25, 2025, 09:26:52 pm »
The built-in solution is Model Transfer. That copies the database record by record.

But this is still a lot slower than database backup/restore. Last time I did a test, the backup/restore took a few seconds (20 seconds maximum) whereas model transfer (SQL Server db to SQL Server DB) took 1.5 hours.

I would go with database backup/restore any day.

Geert

didier.pironet

  • EA User
  • **
  • Posts: 33
  • Karma: +0/-0
Re: Dumping a production database into a staging environment - recommendations
« Reply #4 on: September 25, 2025, 10:52:53 pm »
Quote from: Geert Bellekens on September 25, 2025, 09:26:52 pm
I would go with database backup/restore any day.

That's indeed much faster and resistant to Sparx glitches, but preserving the specific STG security settings is a requirement, and with that method we lose them...
Is there a way to restore them after a full DB restore?


Geert Bellekens

  • EA Guru
  • *****
  • Posts: 13471
  • Karma: +571/-33
  • Make EA work for YOU!
    • Enterprise Architect Consultant and Value Added Reseller
Re: Dumping a production database into a staging environment - recommendations
« Reply #5 on: September 25, 2025, 10:58:02 pm »
Quote from: didier.pironet on September 25, 2025, 10:52:53 pm
That's indeed much faster and resistant to Sparx glitches, but preserving the specific STG security settings is a requirement, and with that method we lose them...
Is there a way to restore them after a full DB restore?
What do you mean by that: DB security, or EA security?

Geert

Sunshine

  • EA Practitioner
  • ***
  • Posts: 1346
  • Karma: +121/-10
  • Its the results that count
Re: Dumping a production database into a staging environment - recommendations
« Reply #6 on: September 26, 2025, 11:11:52 am »
There are several options for transferring data from one Microsoft SQL Server to another, depending on your requirements for volume, frequency, latency, schema complexity, and tooling preferences. Here's a breakdown of the most common methods:
________________________________________
1. Backup and Restore
•   Use case: Full database migration or cloning.
•   How:
    •   Back up the source database (.bak file).
    •   Copy the file to the target server.
    •   Restore it using SQL Server Management Studio (SSMS) or T-SQL.
•   Pros: Simple, preserves schema and data.
•   Cons: Not suitable for partial or incremental transfers.
________________________________________
2. SQL Server Integration Services (SSIS)
•   Use case: Complex ETL workflows, scheduled transfers, data transformations.
•   How: Create SSIS packages to extract, transform, and load data between servers.
•   Pros: Highly customizable, supports transformations and error handling.
•   Cons: Requires setup and maintenance; best for enterprise environments.
________________________________________
3. Linked Servers
•   Use case: Ad-hoc queries or small-scale transfers.
•   How: Configure a linked server in SSMS and use INSERT INTO ... SELECT FROM [LinkedServer].[Database].[Schema].[Table].
•   Pros: Easy to query remote data directly.
•   Cons: Performance can degrade with large datasets; security considerations.
________________________________________
4. Import/Export Wizard (SSMS)
•   Use case: One-time or simple transfers.
•   How: Use SSMS's wizard to copy tables, views, or queries between servers.
•   Pros: GUI-based, quick setup.
•   Cons: Limited control over complex logic or transformations.
________________________________________
5. Replication (Transactional, Merge, Snapshot)
•   Use case: Real-time or near-real-time data synchronization.
•   How: Set up replication between servers.
•   Pros: Automated, supports continuous sync.
•   Cons: Complex setup, requires careful planning.
________________________________________
6. Change Data Capture (CDC) or Change Tracking
•   Use case: Incremental data movement.
•   How: Track changes in source DB and apply them to target.
•   Pros: Efficient for syncing deltas.
•   Cons: Requires additional logic or tooling to move changes.
________________________________________
7. BCP (Bulk Copy Program)
•   Use case: Fast bulk data export/import.
•   How: Use bcp command-line tool to export data to a file and import it into the target server.
•   Pros: Very fast for large datasets.
•   Cons: Schema must be managed separately; not ideal for complex types.
________________________________________
8. Azure Data Factory / Synapse Pipelines
•   Use case: Cloud-based or hybrid environments.
•   How: Use ADF pipelines to move data between on-prem and cloud SQL Servers.
•   Pros: Scalable, supports many sources/destinations.
•   Cons: Requires Azure setup.
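To make option 7 concrete, here is a hedged sketch of a bcp round-trip for a single table. Server names, the use of Windows authentication (-T), and the example table are all placeholders; an EA repository has many tables, so in practice you would script this per table and keep the schema in sync separately:

```
:: Hypothetical bcp round-trip for one table (placeholders throughout).
:: -n = native format, -S = server, -T = trusted (Windows) authentication.
bcp "SELECT * FROM PRD_EA.dbo.t_object" queryout t_object.dat -n -S prod-sql -T
bcp STG_EA.dbo.t_object in t_object.dat -n -S stg-sql -T
```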


Here are strategies to manage this securely:
________________________________________
1. Use a Dedicated Transfer Account with Least Privilege
•   Create a SQL Server login or Windows-authenticated account that has:
    •   Read-only access to the source DB.
    •   Write-only access to the target DB.
•   This avoids privilege escalation and ensures the account can only perform the transfer task.
________________________________________
2. Avoid Transferring Security Principals Directly
•   Do not transfer logins, roles, or permissions unless explicitly required.
•   Instead, map users and roles manually on the target DB to match your security model.
________________________________________
3. Use SSIS with Impersonation or Proxy Accounts
•   SSIS allows you to run packages under a proxy account or use impersonation to access source/target with different credentials.
•   You can configure connection managers with separate credentials for each DB.
________________________________________
4. Script Data Transfer Without Security Metadata
•   Use INSERT INTO ... SELECT FROM ... or BCP/Import-Export tools to move only data, not users, roles, or permissions.
•   This ensures the target DB enforces its own security model.
________________________________________
5. Use Contained Databases (if applicable)
•   If using contained databases, users are defined within the DB itself.
•   You can script user creation separately on the target DB without relying on server-level logins.
________________________________________
6. Audit and Validate Post-Transfer
•   After transfer, validate:
    •   Row-level security (if used).
    •   Permissions on tables, views, and stored procedures.
    •   That no sensitive data was exposed to unauthorized users.
________________________________________
7. Encrypt Sensitive Data During Transfer
•   Use Transparent Data Encryption (TDE) or Always Encrypted if transferring sensitive data.
•   For SSIS or BCP, consider encrypting files in transit and at rest.
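As an illustration of strategy 1 above, a minimal T-SQL sketch of a least-privilege transfer account. The login name, database names, and password are placeholders, and `USE` switching assumes SQL Server on a VM or Managed Instance rather than Azure SQL Database:

```sql
-- Hypothetical least-privilege transfer account (placeholders throughout).
CREATE LOGIN ea_transfer WITH PASSWORD = 'Str0ng!Placeholder';

USE PRD_EA;
CREATE USER ea_transfer FOR LOGIN ea_transfer;
ALTER ROLE db_datareader ADD MEMBER ea_transfer;  -- read-only on the source

USE STG_EA;
CREATE USER ea_transfer FOR LOGIN ea_transfer;
ALTER ROLE db_datawriter ADD MEMBER ea_transfer;  -- write-only on the target
```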

As mentioned, Sparx EA has Project Transfer built in, but it's slower and it will also transfer the source's Sparx EA security settings, so that may not be an option.
Happy to help
:)

Sunshine

  • EA Practitioner
  • ***
  • Posts: 1346
  • Karma: +121/-10
  • Its the results that count
Re: Dumping a production database into a staging environment - recommendations
« Reply #7 on: September 26, 2025, 11:16:33 am »
In Sparx Enterprise Architect, when user security is enabled, access rights and permissions are managed through a combination of users, groups, and permissions, which are stored in specific tables within the EA repository database.
Here are the key tables involved in managing security access rights:
________________________________________
🔐 Security-Related Tables in EA's Database Schema
Table Name   Description
t_secuser   Stores user accounts, including login names, passwords (encrypted), and user status.
t_secgroup   Stores security groups that users can belong to.
t_secgrouppermissions   Maps groups to specific permissions.
t_secusergroup   Maps users to groups.
t_secpermission   Defines the available permissions (e.g., edit diagrams, delete elements).
t_seclocks   Stores information about locked elements, including who locked them and when.
________________________________________
🧩 How It Works
•   Users (t_secuser) are assigned to groups (t_secgroup) via t_secusergroup.
•   Groups are granted permissions (t_secgrouppermissions) which are defined in t_secpermission.
•   Locks (t_seclocks) are used to prevent editing of specific elements or packages unless the user has the appropriate rights.
________________________________________
🛠️ Notes
•   These tables are only populated and used when security is enabled in the EA project.
•   You can query these tables directly if you're using a shared repository (e.g., SQL Server, MySQL).
•   EA also supports Active Directory integration and Single Sign-On (SSO), which can affect how users are authenticated but still relies on these tables for authorization.
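For example, a query along these lines would list each user with their groups via the mapping table. The column names (UserID, UserLogin, GroupID, GroupName) are the usual ones in the EA schema, but verify them against your repository's version before relying on this:

```sql
-- Hypothetical query: list each EA user with the groups they belong to.
SELECT u.UserLogin, g.GroupName
FROM t_secuser u
JOIN t_secusergroup ug ON ug.UserID  = u.UserID
JOIN t_secgroup g      ON g.GroupID  = ug.GroupID
ORDER BY u.UserLogin;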

So if you want to preserve Sparx EA security on the target DB, you need to preserve the data in those tables.
Happy to help
:)