Points well taken. Thanks for looking it over.
Richard Deeming wrote: Purists frown on it,
So I've heard.
What I gather is that if you return an IQueryable, you'll sooner or later end up with queries in places OTHER than your data layer. Should you need that, create a method on the DAL to handle it. I tend to agree with your point, as well as the research I saw, so I don't really have a strong preference.
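For example (a minimal sketch; the Project and ProjectsDal names are hypothetical), a DAL method can materialize its results with ToList() so callers get data rather than a composable query:

```csharp
using System.Collections.Generic;
using System.Linq;

public class Project
{
    public string ProjectName { get; set; }
}

public class ProjectsDal
{
    // Stand-in for the real data source; in EF this would be a DbSet<Project>.
    private readonly List<Project> _projects = new List<Project>();

    public void Add(Project project) => _projects.Add(project);

    // Returning a materialized list keeps the query inside the DAL.
    // Callers cannot compose further predicates onto an IQueryable.
    public IReadOnlyList<Project> FindProjectsByName(string name)
    {
        return _projects
            .Where(p => p.ProjectName == name)
            .ToList();
    }
}
```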
Other than that, would you say my overall design is solid?
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
Ya can't fix stupid.
I think there's still room for improvement.
Your classes are currently tied to specific implementations, which would make it harder to test them.
Perhaps try something like this:
interface IUnitOfWork : IDisposable
{
void Save();
Task SaveAsync();
}
class FalconContextUnitOfWork : IUnitOfWork
{
private readonly FalconContext _context;
public FalconContextUnitOfWork(FalconContext context)
{
_context = context;
}
public void Save()
{
_context.SaveChanges();
}
public Task SaveAsync()
{
return _context.SaveChangesAsync();
}
public void Dispose()
{
_context.Dispose();
}
}
interface IBizObject : IDisposable
{
void AddProject(Project entity);
}
class BizObject : IBizObject
{
private readonly IProjectsRepository _projectsRepository;
private readonly IProjectsTaskRepository _projectsTaskRepository;
private readonly IUnitOfWork _unitOfWork;
public BizObject(IProjectsRepository projectsRepository, IProjectsTaskRepository projectsTaskRepository, IUnitOfWork unitOfWork)
{
_projectsRepository = projectsRepository;
_projectsTaskRepository = projectsTaskRepository;
_unitOfWork = unitOfWork;
}
public void AddProject(Project entity)
{
if (_projectsRepository.FindProjectsByName(entity.ProjectName).Any())
{
throw new DuplicateProjectException(entity.ProjectName);
}
_projectsRepository.Add(entity);
foreach (var task in entity.Tasks)
{
_projectsTaskRepository.Add(task);
}
_unitOfWork.Save();
}
public void Dispose()
{
_unitOfWork.Dispose();
}
}
public class ProjectsController : ApiController
{
private IBizObject _bo;
public ProjectsController(IBizObject bo)
{
_bo = bo;
}
public IHttpActionResult AddProject(Project entity)
{
_bo.AddProject(entity);
return Ok();
}
protected override void Dispose(bool disposing)
{
base.Dispose(disposing);
if (disposing) _bo.Dispose();
}
}
Register all of the concrete implementations as transient services in your DI container. You should then get a new instance of all three classes for each request, while sharing the same instance of your DbContext between them.
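As a sketch of that registration (assuming Microsoft.Extensions.DependencyInjection; a classic Web API project would use Unity, Autofac, or similar, but the lifetimes are the same idea):

```csharp
using System;
using Microsoft.Extensions.DependencyInjection;

// Minimal stand-ins so the sketch is self-contained.
public interface IUnitOfWork : IDisposable { void Save(); }
public class FalconContext : IDisposable { public void Dispose() { } }

public class FalconContextUnitOfWork : IUnitOfWork
{
    private readonly FalconContext _context;
    public FalconContextUnitOfWork(FalconContext context) { _context = context; }
    public void Save() { /* _context.SaveChanges() in the real class */ }
    public void Dispose() { _context.Dispose(); }
}

public static class Startup
{
    public static IServiceProvider Configure()
    {
        var services = new ServiceCollection();

        // One DbContext per request (scoped); a fresh UoW each time it is asked for.
        services.AddScoped<FalconContext>();
        services.AddTransient<IUnitOfWork, FalconContextUnitOfWork>();

        return services.BuildServiceProvider();
    }
}
```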
In your unit tests, you can pass specific mocks of the interfaces to the constructor of the class you're testing.
Implementing the Repository and Unit of Work Patterns in an ASP.NET MVC Application (9 of 10) | Microsoft Docs
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
I see. I think I have it
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
Ya can't fix stupid.
Richard Deeming wrote: Register all of the concrete implementations as transient services in your DI container
I think I need you to explain this some more.
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
Ya can't fix stupid.
I think I'm still confused about this. The example page you linked doesn't fill in all the gaps. Do you know of a better example? I'd really like to see something with all the pieces in place.
Thanks
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
Ya can't fix stupid.
As I look at the example code you provided, I'm concerned about passing the repos into the BO's constructor. There could be dozens of repos, or even hundreds - one for every table.
Or maybe I'm just really confused.
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
Ya can't fix stupid.
Your DbContext will be registered as a "scoped" service - only one instance will be created for each request.
The repositories can either be "scoped" or "transient", where a new instance is created each time it's needed. Since they don't store any state other than the context, I don't see any need to make them "scoped".
The same applies to the IBizObject service.
When a request reaches your controller, the DI container will:
- Create or retrieve the FalconContext for the current request;
- Create a new instance of the FalconContextUnitOfWork and repository classes, passing in the context;
- Create a new instance of the BizObject class, passing in the repository and unit-of-work instances;
- Create a new instance of your controller class, passing in the BizObject instance;
- Invoke the controller action;
- Clean up the transient / scoped instances for the request.
Service lifetimes and registration options | Dependency injection in ASP.NET Core | Microsoft Docs
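The scoped-versus-transient behaviour can be demonstrated directly (again assuming Microsoft.Extensions.DependencyInjection; the FalconContext here is just an empty stand-in):

```csharp
using Microsoft.Extensions.DependencyInjection;

public class FalconContext { }

public static class LifetimeDemo
{
    // Returns (sameWithinScope, sameAcrossScopes).
    public static (bool, bool) Run()
    {
        var provider = new ServiceCollection()
            .AddScoped<FalconContext>()
            .BuildServiceProvider();

        FalconContext first, second, other;

        // Each "request" gets its own scope; disposing the scope cleans up.
        using (var request1 = provider.CreateScope())
        {
            first = request1.ServiceProvider.GetRequiredService<FalconContext>();
            second = request1.ServiceProvider.GetRequiredService<FalconContext>();
        }
        using (var request2 = provider.CreateScope())
        {
            other = request2.ServiceProvider.GetRequiredService<FalconContext>();
        }

        // Same instance within a scope; a different instance in the next scope.
        return (ReferenceEquals(first, second), ReferenceEquals(first, other));
    }
}
```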
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
I have another question on this. Yes, I'm still confused. I'm reading a lot of articles, and there's no end to the ways this is done.
Unit of Work: I understand that EF already IS a Unit of Work. But I see a Unit of Work as a class that handles the data operations for some task. For example, an inventory system might have a Unit of Work that handles a customer purchasing something, which might do the following:
- Check customer credit limit
- Create an invoice header and details
- Call the inventory portion to handle any inventory issues
- Call the shipping portion
In my mind, all of these are 'housed' in one class called PlaceCustomerOrderUOW, and this class lives in the DAL. For everything that needs to be done that is data-related, there would be a UOW to encapsulate all the related work items together.
These UOW's would be the classes that actually create the repos, and the repos are the only classes that know about the DBContext.
So in my example above, the CustOrderController calls BizOb.PlaceCustomerOrder, which calls DAL.PlaceCustomerOrder, which creates the UOW above.
The BizObject is really for ensuring business rules are applied - for example, check the customer's credit limit, check user rights, etc.
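That design might be sketched like this; every type and method name here is hypothetical, and the repository calls are stubbed out:

```csharp
using System;

// Hypothetical stand-ins for the design described above.
public class CustomerOrder
{
    public int CustomerId;
    public decimal Total;
}

public class PlaceCustomerOrderUow
{
    // In the real DAL this would consult a customers repository.
    public bool CreditOk(int customerId, decimal amount) => amount < 1000m;

    public void Execute(CustomerOrder order)
    {
        if (!CreditOk(order.CustomerId, order.Total))
            throw new InvalidOperationException("Credit limit exceeded.");

        CreateInvoice(order);    // invoice header and details
        AdjustInventory(order);  // inventory portion
        ScheduleShipping(order); // shipping portion
        // One SaveChanges() at the end would commit all of it together.
    }

    private void CreateInvoice(CustomerOrder order) { }
    private void AdjustInventory(CustomerOrder order) { }
    private void ScheduleShipping(CustomerOrder order) { }
}
```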
I'm curious what you think of this design.
Thanks
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
Ya can't fix stupid.
Kevin Marois wrote: These UOW's would be the classes that actually create the repos
I still think you should be passing the repo instances in via the constructor. In the test environment, you need to be able to pass mocked instances in, so that you can test just the logic of the class, rather than also testing the concrete repo implementation.
In the live environment, the DI container will take care of creating the instances for you and passing them to the constructor.
If your class needs precise control over when the repo instance is created, then pass in a "factory" interface, with a method to create the repo instance.
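A factory interface along those lines might look like this (a sketch; all names are made up):

```csharp
public interface IProjectsRepository { }
public class ProjectsRepository : IProjectsRepository { }

// A factory interface lets the class decide *when* the repo is created,
// while tests can still supply a mock factory.
public interface IRepositoryFactory
{
    IProjectsRepository CreateProjectsRepository();
}

public class RepositoryFactory : IRepositoryFactory
{
    public IProjectsRepository CreateProjectsRepository() => new ProjectsRepository();
}

public class BizObject
{
    private readonly IRepositoryFactory _factory;
    public BizObject(IRepositoryFactory factory) { _factory = factory; }

    public void DoWork()
    {
        // The repo is created only at the moment it is needed.
        var repo = _factory.CreateProjectsRepository();
    }
}
```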
Otherwise, sounds good.
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
With LINQ to SQL, you created POCOs and passed them to the DAL, which copied the data to the entity and saved it. The DAL was hidden from the rest of the app and only needed the POCOs.
In EF, the entities are generated for you. If you use those entities, you are coupling the DAL to whatever else calls it. You're forced to reference the DAL in the UI so that you can populate and/or bind to entities.
I know you can generate the entities into another location, such as a common project, but it's really a pain to get working and maintain.
I'd like to hear what others think about this.
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
Ya can't fix stupid.
Entities are typically defined in the "Model" folder / project.
The ORM and the app reference the "model".
Regardless of where or how the entities are created, they can all be extended via "partial" classes; with extended properties tagged as "non-db" as appropriate; keeping both the ORM and the app happy.
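For example, a hypothetical Customer entity split across two partial class files, with a computed property the ORM ignores via EF's [NotMapped] attribute:

```csharp
using System.ComponentModel.DataAnnotations.Schema;

// Generated half (e.g. by the EF designer) - not edited by hand.
public partial class Customer
{
    public int Id { get; set; }
    public string FirstName { get; set; }
    public string LastName { get; set; }
}

// Hand-written half in the same namespace and assembly.
public partial class Customer
{
    // "Non-db" property: the ORM ignores it, the app can bind to it.
    [NotMapped]
    public string FullName => $"{FirstName} {LastName}";
}
```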
"(I) am amazed to see myself here rather than there ... now rather than then".
― Blaise Pascal
If the models are generated in the DAL project and you want to use a model in the UI, then you have to reference the DAL project in the UI project. Extending the class doesn't do anything to fix this.
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
Ya can't fix stupid.
Your "DAL" (i.e. db context / repository) can apply to multiple models / entities.
If it is included in with a particular model, it is either a conscious design decision (due to scale), or a design mistake. (You can control namespace generation).
Extending "Model / DAL" classes allows you to mix "code-first" and "database first", and not worry about who "owns" the entity (ORM, App, or "common").
Or, use DTO's with your POCO's.
Data transfer objects are "legit" in many cases; as long as they don't include logic. More coding though.
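A minimal DTO sketch (hypothetical names), where the DTO carries no logic and no ORM reference, so the UI project never needs to see the DAL:

```csharp
// Hypothetical entity as generated by the ORM.
public class CustomerEntity
{
    public int Id { get; set; }
    public string Name { get; set; }
    public string InternalNotes { get; set; } // not for the UI
}

// Plain DTO: no logic, no ORM reference, safe to share with the UI.
public class CustomerDto
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class CustomerMapper
{
    public static CustomerDto ToDto(CustomerEntity entity) =>
        new CustomerDto { Id = entity.Id, Name = entity.Name };
}
```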
(The DAL / Model mixed bag also made things messy when trying to maintain a clear separation between client and ASP.NET server; when both "shared" the same "entities"; but NOT DATA ACCESS METHODS).
"(I) am amazed to see myself here rather than there ... now rather than then".
― Blaise Pascal
modified 29-Jun-18 18:20pm.
Non-trivial applications that are actually used in businesses for any length of time will always end up tied to many things.
It is possible, but difficult, to reduce coupling to very low levels. HOWEVER, doing that introduces code complexity and is unlikely to lead, in totality, to reduced business costs. That is because, of the many subsystems, it isn't likely that more than a few, if any, will ever need to pivot to another solution. And because it is difficult to get it right and maintain that over many iterations and many subsystems, it will likely lead to increased costs due to the additional complexity.
We don't use EF for a number of reasons that are not relevant here. We use WCF to serve the data, as no client is allowed to talk to a database directly, so our structure is:
WCF Service - connects to the database and exposes an interface of the DAL operations for the project, references the Models project.
Models - Representation of the database tables, views and SP output. Implements OPC on the fields.
UI - Presentation layer - service reference to the WCF and references the Models project.
When we move to a Web API service, the models project will be exclusive to the UI and the service will serve up JSON data.
It really helps that we have a code generator that builds the CRUD stored procedures, WCF DAL, models, UI grids and dialogs. I'm a lazy bastard and only write code for customised presentation, of which there is plenty.
Never underestimate the power of human stupidity
RAH
I read this article about a basic repository pattern. I understand that a repository would implement something like this:
public interface IRepository<TEntity> where TEntity : class
{
TEntity Get(int id);
IEnumerable<TEntity> GetAll();
IEnumerable<TEntity> Find(Expression<Func<TEntity, bool>> predicate);
void Add(TEntity entity);
void AddRange(IEnumerable<TEntity> entities);
void Remove(TEntity entity);
void RemoveRange(IEnumerable<TEntity> entities);
}
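A minimal in-memory implementation of a trimmed-down version of that interface, standing in for an EF-backed one (where the methods would delegate to _context.Set&lt;TEntity&gt;()):

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Linq.Expressions;

public interface IRepository<TEntity> where TEntity : class
{
    IEnumerable<TEntity> Find(Expression<Func<TEntity, bool>> predicate);
    void Add(TEntity entity);
    void Remove(TEntity entity);
}

// In-memory stand-in; an EF version would call _context.Set<TEntity>().
public class InMemoryRepository<TEntity> : IRepository<TEntity> where TEntity : class
{
    private readonly List<TEntity> _items = new List<TEntity>();

    public IEnumerable<TEntity> Find(Expression<Func<TEntity, bool>> predicate) =>
        _items.AsQueryable().Where(predicate).ToList();

    public void Add(TEntity entity) => _items.Add(entity);
    public void Remove(TEntity entity) => _items.Remove(entity);
}
```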
The author also describes a Unit of Work, which is a class that encapsulates some work to be done and then finalizes all data saving at one time:
(in this example, the "PlutoContext" is an EF data context)
public class UnitOfWork : IUnitOfWork
{
private readonly PlutoContext _context;
public UnitOfWork(PlutoContext context)
{
_context = context;
Courses = new CourseRepository(_context);
Authors = new AuthorRepository(_context);
}
public ICourseRepository Courses { get; private set; }
public IAuthorRepository Authors { get; private set; }
public int Complete()
{
return _context.SaveChanges();
}
public void Dispose()
{
_context.Dispose();
}
}
The UnitOfWork class wraps up calls to the repositories for a specific functionality into one class.
1) Would you use this unit-of-work idea? Why not just make methods on a DAL class to do this, rather than specific IUnitOfWork implementations? I guess you could define specific unit-of-work classes that each wrap a specific piece of functionality, but these have to live somewhere and just seem like an additional layer of abstraction that probably isn't needed. What do you think?
2) From what I can see, EF is already made up of units of work. You can queue up inserts, deletes, and updates, then call Save once for all of them. If so, it sounds like a separate UnitOfWork concept isn't needed for EF.
3) When EF queries data, how does it get the data into the Entities? Is it using ADO under the sheets and simply copying the data from a DataTable or DataReader to the entity's properties?
4) Does EF keep track of EVERY property changed, then write out only those changes? Or when Save is called is the ENTIRE entity written to the DB?
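Question 4 touches on snapshot change tracking. As a toy illustration of the general idea (this is not EF's actual implementation, just a sketch of the approach): original values are captured up front, and on save only the properties whose current values differ would be written out.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Toy snapshot change tracker: captures original values on attach,
// then reports which properties have changed since.
public class Tracked<T>
{
    private readonly Func<T, Dictionary<string, object>> _read;
    private readonly Dictionary<string, object> _snapshot;
    public T Entity { get; }

    public Tracked(T entity, Func<T, Dictionary<string, object>> read)
    {
        Entity = entity;
        _read = read;
        _snapshot = read(entity); // original values captured here
    }

    // Compare current values to the snapshot; only differences count.
    public IEnumerable<string> ModifiedProperties() =>
        _read(Entity).Where(kv => !Equals(kv.Value, _snapshot[kv.Key]))
                     .Select(kv => kv.Key);
}

public class Person { public string Name; public int Age; }
```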
Thanks!
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
Ya can't fix stupid.
modified 29-Jun-18 12:11pm.
The whole reason for posting questions here was to avoid reading thousands of lines of code
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
Ya can't fix stupid.
Just making sure you had the resource handy. If it were me, and I really wanted to know, I'd go look rather than rely on what some random dude(tte) on the internets had to say about it
"There are three kinds of lies: lies, damned lies and statistics."
- Benjamin Disraeli
I already have the code. Not sure if you've ever seen it, but it's not trivial
If it's not broken, fix it until it is.
Everything makes sense in someone's mind.
Ya can't fix stupid.
If you only have a "few" entities, then one repository per entity is overkill IMO.
I wrap the whole DB context in a "repository" and tailor it to the "problem"; accessing specific entities "by name", like GetCustomer(id); instead of resorting to "patterns and generics" that make the whole thing more obtuse.
"SaveChanges" saves the whole "context"; that's why a "centralized" save makes no sense. For "big" object graphs and "batch jobs", one wants to save at appropriate intervals to avoid one (big) update stream.
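As a sketch of that style (all names hypothetical): one repository wrapping the whole context, with named methods instead of generics. For batch jobs, the caller would invoke Save() every N items instead of once at the end:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical "whole-context" repository tailored to the problem domain.
public class FalconRepository : IDisposable
{
    // Stand-in for a DbContext; seeded so the sketch is self-contained.
    private readonly Dictionary<int, string> _customers =
        new Dictionary<int, string> { [1] = "Acme" };

    // Named, self-documenting accessors instead of generic patterns.
    public string GetCustomer(int id) => _customers[id];

    public void AddCustomer(int id, string name) => _customers[id] = name;

    // In an EF version this would be _context.SaveChanges();
    // batch jobs would call it every N items to avoid one big update stream.
    public void Save() { }

    public void Dispose() { }
}
```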
"Hanging" onto db contexts for a "long time" is also a problem (and bad habit) for most.
Creating new db contexts is "light-weight", and helps avoid memory leaks. The "data base calls" are what impacts performance.
The actual number of "unique" calls that one makes to an ORM is rather small and "self-documenting" code beats generics when you start solving "application problems" and having to apply "user terminology" change requests.
Generics are fine for "tools"; not business logic; and "use case" ORM queries, etc. fall under "business logic".
(EF gets bad-mouthed a lot because of all the "patterns" others want to apply and the (bad) results you wind up having to contend with; not due to some inherent defect in EF).
Writing "fewer" lines of code is not necessarily progress, nor is it free of future "debt".
"(I) am amazed to see myself here rather than there ... now rather than then".
― Blaise Pascal
modified 29-Jun-18 16:45pm.
We have an Oracle data warehouse that I want to set up a web API service to allow authorised applications to query; note that users will not have access to the API, only applications.
I would ideally like an environment where a developer can build and test a query, construct it as a DLL that conforms to a defined structure, and drop the file into a folder where the API can find it.
Setting up an interface to service all queries should be fairly simple: a query string and a dictionary or tuple of parameters. Then have the API deliver a JSON result.
I want to know if MEF will meet these requirements.
Assuming MEF will do this, and the web API only accepts authorised (via token) requests, are there any security flaws in the design?
Never underestimate the power of human stupidity
RAH
Yes, MEF will work very well for this purpose. I've built similar systems before, and key sticking points for me were:
- Use a good standards-based query approach (like REST) so that you can build a concise base class for the DirectoryCatalog to pick up via [InheritedExport]. Trying to get fancy with specific method location (such as Dictionary<string, Func<IOwinRequest, object>> ControllerActions) will quickly lead to abstraction bloat and maintenance nightmares.
- Make sure you have your authorization model de-coupled from the controller (though metadata attributes can be used to provide this in concrete implementations). This may work with WebAPI2 out of the box, but I'm not 100% positive on that count (I used a custom OWIN web server for my implementation).
- If you're using an ORM, either make sure to resolve objects before they leave your controller, or make your data context a part of a request object that is passed to your controller actions.
Those are the biggies that have caused me tears over the years.
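The drop-a-DLL-in-a-folder discovery can be sketched with plain reflection so the example stays self-contained; MEF's DirectoryCatalog plus [InheritedExport] automates the same idea across every assembly in a folder. All names below are hypothetical:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Reflection;

// The contract every dropped-in query DLL would implement.
// (With MEF you would mark it [InheritedExport] and use a DirectoryCatalog.)
public interface IWarehouseQuery
{
    string Name { get; }
    string Execute(IDictionary<string, object> parameters); // returns JSON
}

public class OpenOrdersQuery : IWarehouseQuery
{
    public string Name => "open-orders";
    public string Execute(IDictionary<string, object> parameters) =>
        "{\"orders\":[]}"; // stand-in for a real Oracle query plus serialization
}

public static class QueryCatalog
{
    // MEF's DirectoryCatalog does this across every DLL in a folder;
    // here we scan a single assembly for concrete implementations.
    public static List<IWarehouseQuery> Discover(Assembly assembly) =>
        assembly.GetTypes()
            .Where(t => typeof(IWarehouseQuery).IsAssignableFrom(t)
                        && !t.IsAbstract && !t.IsInterface)
            .Select(t => (IWarehouseQuery)Activator.CreateInstance(t))
            .ToList();
}
```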
"There are three kinds of lies: lies, damned lies and statistics."
- Benjamin Disraeli