
Multiple Connections to Oracle - Multi-Task

I have a console application that, generally speaking, must scan the entire database and do some processing...

To do so, I am using Tasks like this:

static void Main(string[] args)
{
    var dateStart = DateTime.Now.AddDays(-35);
    var dateEnd = DateTime.Now;

    var taskList = new List<Task>();

    while (dateStart > dateEnd ? dateStart >= dateEnd : dateStart <= dateEnd)
    {
        var d = dateStart.Date;

        var dispositivesBll = new DispositivesBll();
        taskList.Add(Task.Run(() => dispositivesBll.Foo(d))
                         .ContinueWith(x => dispositivesBll.Dispose())
                         .ContinueWith(x => GC.Collect()));

        var dispositivesBllNew = new DispositivesBll();
        taskList.Add(Task.Run(() => dispositivesBllNew.Boo(d))
                         .ContinueWith(x => dispositivesBllNew.Dispose())
                         .ContinueWith(x => GC.Collect()));

        if (taskList.Count >= 2 * 5)
        {
            Task.WaitAll(taskList.ToArray());
            taskList.Clear();
        }
        dateStart = dateStart > dateEnd ? dateStart.AddDays(-1) : dateStart.AddDays(1);
    }
    Task.WaitAll(taskList.ToArray());
}

So basically I want to run 10 days at once, as you may have noticed at if (taskList.Count >= 2 * 5), but the problem is that my Foo and Boo methods open multiple connections to one Oracle database.
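As an aside, batching ten tasks and calling Task.WaitAll makes every batch wait for its slowest day. A common alternative is to throttle with SemaphoreSlim so a new day starts as soon as a slot frees up. This is a minimal sketch with stand-in work; the names RunThrottled and ThrottleDemo are hypothetical, not from the code above:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Threading;
using System.Threading.Tasks;

class ThrottleDemo
{
    // Start one task per "day", but never let more than maxParallel run at once.
    public static int[] RunThrottled(IEnumerable<int> days, int maxParallel)
    {
        var results = new List<int>();
        using (var gate = new SemaphoreSlim(maxParallel))
        {
            var tasks = days.Select(day => Task.Run(() =>
            {
                gate.Wait();                          // block if maxParallel tasks are already running
                try
                {
                    lock (results) results.Add(day * 2);  // stand-in for the Foo/Boo work
                }
                finally
                {
                    gate.Release();                   // free a slot as soon as this day is done
                }
            })).ToArray();
            Task.WaitAll(tasks);
        }
        return results.OrderBy(x => x).ToArray();
    }

    static void Main()
    {
        var doubled = RunThrottled(Enumerable.Range(1, 10), maxParallel: 4);
        Console.WriteLine(string.Join(",", doubled));
    }
}
```

With this shape, a slow day only occupies one slot instead of stalling a whole batch of ten.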

public class DispositivesBll : IDisposable
{
    private readonly OracleDal _oracleDal = new OracleDal();
    private List<MyObject> _listSuccess;
    private List<MyObject> _listFailure;
    private Hashtable _hash;

    public void Foo(DateTime data)
    {
        var t1 = Task.Run(() => { _listSuccess = _oracleDal.GetSuccessList(); });
        var t2 = Task.Run(() => { _listFailure = _oracleDal.GetFailureList(); });

        t1.Wait();
        t2.Wait();

        var mergeList = new List<MyObject>();
        foreach (var success in _listSuccess)
        {
            //Some logic to insert objects into "mergeList"
        }

        if (mergeList.Any())
            Task.Run(() => _oracleDal.MergeList(mergeList)).Wait();
    }

    public void Dispose()
    {
        if (_hash != null)
            _hash.Clear();
        _hash = null;
    }
}

and my Merge Method:

    public void MergeList(List<MyObject> mergeList)
    {
        using (var conn = new OracleConnection(Connection.ConnectionString))
        {
            if (conn.State != ConnectionState.Open)
                conn.Open();
            using (var oCommand = conn.CreateCommand())
            {
                oCommand.CommandType = CommandType.Text;
                oCommand.CommandText = string.Format(@"
                MERGE INTO MyTable dgn
                USING (select id from another_table where field = :xpe) d
                ON (TO_CHAR(dateHappen, 'DDMMYYYY') = {0} and id = :xId)
                WHEN MATCHED THEN
                    UPDATE SET OK = :xOk, dateHappen = SYSDATE
                WHEN NOT MATCHED THEN
                    INSERT (fields....)
                    VALUES (values...)");
                oCommand.BindByName = true;
                oCommand.ArrayBindCount = mergeList.Count;

                oCommand.Parameters.Add(":xId", OracleDbType.Int32,
                                        mergeList.Select(c => Convert.ToInt32(c.Id)).ToArray(),
                                        ParameterDirection.Input);

                oCommand.Parameters.Add(":xpe", OracleDbType.Varchar2,
                                        mergeList.Select(c => Convert.ToString(c.Xpe)).ToArray(),
                                        ParameterDirection.Input);

                oCommand.ExecuteNonQuery();
            }
        }
    }

The problem is: each "day" takes about 2 hours to process everything... and we have a daily database backup plan that stops the database for about 10 minutes... which locks up my process...

So what do I do? I stop the process manually and start it again, skipping the dates already processed. BUT if I have 20 connections open, they would stay that way... so I have to kill those sessions every time... Is there a way to force all connections to dispose?

EDIT:

MyTable has 50 million rows... composed of ID | STATE | DATE... basically I have to cross those STATEs with their DATEs... So the big delay is in the database... It's a known issue that we have to refactor the entire database model... and we are going to do that soon...

But anyway, despite the processing time, if I could just manage (or force) the connections to be killed, it would be fine...

Any ideas?

Okay, in order to kill a session, you would need to involve a DBA, which, if you do it several times a week, would not make for a good friendship! As I and others pointed out, a database should not need to be brought down for a backup (except for a cold backup), or even for an export; but if you have these long merge processes, a consistent export would take quite a while. So the first step is to (nicely) educate the DBAs on keeping the database in archivelog mode and having RMAN manage online backups.

Second, you need to improve the merge criteria so that the merging columns are indexed. It could well be that dateHappen is indexed, but since you are invoking a function on it in the merge criteria, that index cannot be used unless a function-based index is created. I am referring to the

TO_CHAR(dateHappen, 'DDMMYYYY') = {0}

specifically; and in general,

USING (select id from another_table where field = :xpe) d 
ON ( TO_CHAR(dateHappen, 'DDMMYYYY') = {0} and id = :xId) 

You should check whether id and field on another_table are indexed, and create a function-based index on dateHappen:

create index i_date_string on whatever_table (TO_CHAR(dateHappen, 'DDMMYYYY'))

For your database tuning, just invoke the merge statement by itself in a SQL tool to try different approaches; then you won't have to worry about killing merges half-way, which would cause a lot of work in the database rolling back transactions, etc.

Update:

OK, I will answer the question instead of providing my solution. :)

You can create a profile and assign it to a user to limit sessions to a specific CONNECT_TIME or IDLE_TIME, or both. If the user exceeds the time, then according to http://docs.oracle.com/database/121/SQLRF/statements_6012.htm#SQLRF01310:

If a user exceeds the CONNECT_TIME or IDLE_TIME session resource limit, then the database rolls back the current transaction and ends the session. When the user process next issues a call, the database returns an error.

So you can do this:

create profile MERGE_PROF limit idle_time 5 connect_time 86400;
alter user BATCH_MERGER_USER profile MERGE_PROF;

Then when your session is created, if the session is idle for 5 minutes, Oracle will kill the session, and if it is running solid for 24 hours, (running commands with less than 5 minutes between commands for 24 hours) Oracle will kill the session.
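On the application side, ODP.NET can also flush pooled connections without involving a DBA. OracleConnection.ClearPool and OracleConnection.ClearAllPools are real ODP.NET methods; the wrapper name ReleaseConnections below and the use of the Oracle.ManagedDataAccess.Client provider are assumptions for illustration:

```csharp
using Oracle.ManagedDataAccess.Client;

static class ReleaseConnections
{
    // Call this before stopping the process (e.g. from a Console.CancelKeyPress
    // handler) so pooled sessions are closed rather than left lingering on the server.
    public static void FlushPools(OracleConnection current = null)
    {
        if (current != null)
            OracleConnection.ClearPool(current); // drop the pool this connection came from

        OracleConnection.ClearAllPools();        // drop every connection pool in this process
    }
}
```

Note that clearing a pool only disposes idle pooled connections immediately; connections still busy inside a long merge are discarded once they are closed, so sessions stuck mid-statement are still best handled by the profile limits above.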
