
Creating Custom Web Crawler with Dotnet Core using Entity Framework Core and C#

24 Feb 2019 · CPOL
Many web scraping and web crawler frameworks exist on other platforms and infrastructures, but in the .NET environment there are few options that accommodate custom crawling requirements. This article walks through building a custom web crawler with .NET Core, C# and Entity Framework Core.
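Before diving in, here is a minimal sketch of the kind of crawl loop such a custom crawler is built around: fetch a page with HttpClient, pull out its links, and hand them back for later visits. It uses only the base class library (a regex instead of an HTML parser), and the type and member names (SimpleCrawler, VisitAsync) are illustrative only, not the article's actual implementation.

```csharp
using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Text.RegularExpressions;
using System.Threading.Tasks;

public class SimpleCrawler
{
    private static readonly HttpClient Http = new HttpClient();

    // Download a page and return the absolute URLs found in its href attributes.
    public async Task<IReadOnlyList<Uri>> VisitAsync(Uri page)
    {
        string html = await Http.GetStringAsync(page);
        var links = new List<Uri>();
        foreach (Match m in Regex.Matches(html, "href=\"(.*?)\"", RegexOptions.IgnoreCase))
        {
            // Resolve relative links against the page we just visited.
            if (Uri.TryCreate(page, m.Groups[1].Value, out Uri absolute))
                links.Add(absolute);
        }
        return links;
    }

    public static async Task Main()
    {
        var crawler = new SimpleCrawler();
        foreach (var link in await crawler.VisitAsync(new Uri("https://example.com")))
            Console.WriteLine(link);
    }
}
```

A real crawler adds a visited-URL set, politeness delays, and persistence of the extracted data, which is where Entity Framework Core comes in.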

License

This article, along with any associated source code and files, is licensed under The Code Project Open License (CPOL)

About the Author

Mehmet Ozkaya
Technical Lead
Turkey
• 10+ years of hands-on experience in Microsoft .NET development. Experience in web application development using C#, .NET Core, ASP.NET/MVC/ASP.NET Core, JavaScript, Angular, TypeScript
• Strong experience designing and implementing RESTful APIs with JSON, and SOAP web services using WCF and ASP.NET Core Web API
• Excellent knowledge of Object-Oriented Design and Development, SOLID principles, DDD, Design Patterns, SOA, TDD and Unit Testing.
• Strong experience using web front-end technologies such as DOM, HTML5, CSS3, JavaScript, JSON, jQuery, TypeScript
• Experience using ORM tools such as Entity Framework Core, NHibernate, Dapper.
• Good knowledge of Distributed Software Architecture and Messaging Technologies in Microservices Architecture, API Gateways (Orleans), BDD, DDD/CQRS patterns, Distributed Message Broker Service Bus Applications (MQTT, AMQP, MassTransit, RabbitMQ, Azure Service Bus), Event Sourcing, Cloud Computing Architectures, IAAS platforms (AWS, Azure), Application Containerization, Docker.
• Experience using distributed caching and related open source tools such as Redis, Camunda, Castle Windsor, AutoMapper, Log4Net.

GitHub : mehmetozkaya
