Hi all,
I have a large amount of real-time data that needs to be processed as fast as possible.
The data arrives from multiple threads over network connections.
Every network thread passes its data to a shared function, which translates and interprets it and then saves each resulting object into a ConcurrentDictionary.
The problem is that this dictionary ends up holding around 150K objects, and fetching an object in order to update it takes far longer than is acceptable.
using System;
using System.Timers;

public class MyObject
{
    System.Timers.Timer LostTimer = new System.Timers.Timer();
    public int ID;
    public DateTime UpdateTime;

    public MyObject()
    {
        LostTimer.Interval = 20000;
        LostTimer.Elapsed += new ElapsedEventHandler(LostTimer_Elapsed);
        LostTimer.Enabled = true;
    }

    void LostTimer_Elapsed(object sender, ElapsedEventArgs e)
    {
        // Report the object as lost when no update has arrived in the last 20 seconds.
        if (UpdateTime < DateTime.Now.AddSeconds(-20))
            Console.WriteLine(ID + " Lost...");
    }
}
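The lost check compares the last update time against the timer interval. A minimal standalone sketch of that staleness test (class and method names are hypothetical, not from my code):

```csharp
using System;

class LostCheckSketch
{
    // True when the last update is older than the timeout,
    // i.e. the object should be reported as lost.
    static bool IsLost(DateTime updateTime, TimeSpan timeout)
    {
        return updateTime < DateTime.Now - timeout;
    }

    static void Main()
    {
        TimeSpan timeout = TimeSpan.FromSeconds(20);
        Console.WriteLine(IsLost(DateTime.Now, timeout));                  // just updated: not lost
        Console.WriteLine(IsLost(DateTime.Now.AddSeconds(-30), timeout)); // 30 s stale: lost
    }
}
```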
public class MyClass
{
    public MyClass() { }

    private ConcurrentDictionary<int, MyObject> Objects = new ConcurrentDictionary<int, MyObject>();

    void NetworkThread1DataReceived(eventArgs e)
    {
        Translate(e.Data);
    }

    void Translate(string[] data)
    {
        Task.Factory.StartNew(() =>
        {
            Parallel.ForEach(data, s =>
            {
                MyObject o = null;
                int id = int.Parse(s);
                Objects.TryGetValue(id, out o);
                if (o == null)
                {
                    o = new MyObject();
                    o.ID = id;
                    o.UpdateTime = DateTime.Now;
                    Objects.TryAdd(id, o); // key is the parsed int, not the string
                }
                else
                {
                    o.UpdateTime = DateTime.Now;
                }
            });
        });
    }
}
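Note that the TryGetValue-then-TryAdd sequence in Translate is not atomic: two threads receiving the same ID can both miss the lookup and both construct a MyObject. A minimal sketch of the same record-the-timestamp step done in one atomic call with ConcurrentDictionary.AddOrUpdate (simplified to store only timestamps; names are hypothetical):

```csharp
using System;
using System.Collections.Concurrent;

class AddOrUpdateSketch
{
    static readonly ConcurrentDictionary<int, DateTime> LastSeen =
        new ConcurrentDictionary<int, DateTime>();

    // Parses the id and records "now" for it in a single atomic dictionary
    // call, so two threads handling the same id cannot both take the add path.
    static void Touch(string s)
    {
        int id = int.Parse(s);
        LastSeen.AddOrUpdate(id, DateTime.Now, (key, previous) => DateTime.Now);
    }

    static void Main()
    {
        Touch("42");
        Touch("42"); // second call updates the existing entry, no duplicate
        Console.WriteLine(LastSeen.Count); // one entry for id 42
    }
}
```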
Now, once there are more than 30K objects, it starts reporting objects as lost.
I need help urgently, please.
Thank you!