Author Topic: Yet Another Cloud Services Question

Yet Another Cloud Services Question
« on: February 12, 2016, 06:23:29 am »
I'm trying to update a shared repository located on a remote server via Sparx' Cloud Services. I work in a local repository, export my package and then import it into the remote repository. The repository is a representation of a company's enterprise business processes and value streams. As such, it contains multiple nested levels of packages, with each nesting representing a level of business process decomposition.

This is something I stumbled upon yesterday:
EA reported timeout errors each time I tried to import any package into a nested level of the shared repository - picture packages within packages within packages. I experimented with importing small maps yet the problem persisted.

Then I tried importing, into the top level of my server-based repository, a package that had repeatedly failed to import into a nested package on the server. Lo and behold! The import was successful, ending months of being stymied by this problem.

Can someone provide me with any insight as to what might be happening? Nesting packages in multiple levels makes no difference when importing a package into a local repository - or when someone local to the server imports the same package to its nested location - but makes a huge difference when importing using cloud services.

In researching this topic on this forum, I found this from Geert:

The EA client is very "chatty" with the database server. It sends up to hundreds of small sql queries to the database to get every little piece of information.
So it is not so much the bandwidth as the response time that will be the bottleneck when dealing with remote clients.

The only real alternative is to use a central version control system. Then each user can have their own local model, and they check out the parts of the model they want to work on.

Checking in/out takes a while too, but as long as you keep the size of the packages small enough it is a workable solution.


I'm thinking one of the contributing factors may be network latency, although I cannot prove this. I do know the campus where I work has a 1 Gb/s fiber pipe to the Internet and supports a thousand or more people. I tested my upload speed and found it to be 2 Mb/s at the time; download speed was about 12x faster. I also know people stream video to their desktops, which may have some effect on what I'm trying to accomplish.
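To get a feel for why latency rather than bandwidth would dominate here, here is a back-of-envelope sketch. All the numbers (round-trip times, query count, payload size) are assumptions for illustration, not measurements from my setup - only the 2 Mb/s upload figure comes from my test above:

```python
# Back-of-envelope: why a "chatty" client suffers over a high-latency link.
# All numbers below are illustrative assumptions, not measurements.
RTT_LAN = 0.001          # assumed round trip on a LAN, in seconds
RTT_WAN = 0.050          # assumed round trip over the Internet, in seconds
QUERIES = 5000           # assumed number of small SQL round trips per import
PAYLOAD_BYTES = 2_000_000        # assumed total data moved (~2 MB)
UPLOAD_BPS = 2_000_000 / 8       # 2 Mb/s upload, expressed in bytes/second

def import_time(rtt: float) -> float:
    """Total time: one full round trip per query, plus raw transfer time."""
    return QUERIES * rtt + PAYLOAD_BYTES / UPLOAD_BPS

print(f"LAN: {import_time(RTT_LAN):.0f} s")   # -> LAN: 13 s
print(f"WAN: {import_time(RTT_WAN):.0f} s")   # -> WAN: 258 s
```

Under these assumptions the transfer time (8 s) is identical in both cases; the extra four minutes over the WAN is pure round-trip waiting, which would also explain why a single bulk export is unaffected.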

By the way, I can successfully export the whole remote repository, which becomes a 100 MB XML file, with no timeout issues.

I appreciate any comments / insights.

« Last Edit: February 12, 2016, 06:27:34 am by SilverSage »