|
You're asking this question in the wrong place.
|
|
|
|
|
Hello,
I have two SQL 2016 Always On nodes (VMware virtual machines). Each node has 250GB RAM and 46 vCPU @ 2.5GHz. Each MS SQL VM is dedicated to one ESXi host, so there is no resource contention.
The application that will connect to the database is expected to be 70% WRITE and 30% READ.
I need advice on the best way to:
(1.) Configure the MS SQL nodes to use the 250GB RAM and 46 vCPU efficiently and optimally.
(2.) Perform writes and reads as fast as possible using all the hardware resources.
(3.) I have configured Always On Read Routing, how can I test it?
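For (3.), one common way to check read-only routing is to connect to the listener with ApplicationIntent set to ReadOnly and see which replica answers (a sketch; the listener and database names below are placeholders):

```
sqlcmd -S tcp:YourListenerName,1433 -d YourDatabase -E -K ReadOnly -Q "SELECT @@SERVERNAME"
```

If routing is working, @@SERVERNAME should come back as a secondary replica; running the same command without -K ReadOnly should return the primary. Note that a database in the availability group must be specified with -d, since routing only happens for connections to an availability database.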
Thanks.
|
|
|
|
|
|
Hi,
I am designing a system for a Java EE based web application in which there are lots of master data tables. New master data can be added, or a new field may be added to the master tables.
The master tables can fetch their data from the database or from a web service.
I am thinking that instead of writing the logic for each master table separately, I could create some sort of code generator which generates the code for master data management.
Essentially, every master table needs:
A view to add master data to the table
A view to update the master data
A view to list all the master data
with user actions being
Bulk upload
Search and Filter
Has anyone created something like this for a Java EE based application?
|
|
|
|
|
What does the "user" need?
What you're talking about SQL Server Management Studio does "out of the box"; and it's "free".
You're going to charge for this?
"(I) am amazed to see myself here rather than there ... now rather than then".
― Blaise Pascal
|
|
|
|
|
We are creating a product; the user doesn't need anything specific. We have lots of master data which will change dynamically once we go to market: master data like city, country, district. Basically the admin screen has actions like add, update, listing, search.
The normal approach is to design the table, create a DAO, add controller and service layers, and then the frontend screen. If a new table comes in, the same things need to be created again.
I was thinking of a way to automate the creation of such code, or to create something with a common set of code that can accommodate any master table introduced in the future, or updates to an existing table.
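A minimal sketch of the generic approach described above, assuming an in-memory store and illustrative names: one repository serves any master table as maps of field names to values, so a new master table needs no new DAO/service/controller code.

```java
import java.util.*;
import java.util.concurrent.*;

// Hypothetical sketch: one generic repository for any master table
// (city, country, district, ...), keyed by table name.
public class MasterDataRepository {
    // table name -> (record id -> record fields)
    private final Map<String, Map<Long, Map<String, Object>>> tables = new ConcurrentHashMap<>();
    private long nextId = 1;

    public synchronized long add(String table, Map<String, Object> record) {
        long id = nextId++;
        tables.computeIfAbsent(table, t -> new ConcurrentHashMap<>())
              .put(id, new HashMap<>(record));
        return id;
    }

    public void update(String table, long id, Map<String, Object> fields) {
        Map<String, Object> rec = tables.getOrDefault(table, Map.of()).get(id);
        if (rec == null) throw new NoSuchElementException(table + "/" + id);
        rec.putAll(fields);
    }

    public List<Map<String, Object>> list(String table) {
        return new ArrayList<>(tables.getOrDefault(table, Map.of()).values());
    }

    // Simple search/filter: records where the given field contains the term.
    public List<Map<String, Object>> search(String table, String field, String term) {
        List<Map<String, Object>> out = new ArrayList<>();
        for (Map<String, Object> rec : tables.getOrDefault(table, Map.of()).values()) {
            Object v = rec.get(field);
            if (v != null && v.toString().toLowerCase().contains(term.toLowerCase())) out.add(rec);
        }
        return out;
    }

    public static void main(String[] args) {
        MasterDataRepository repo = new MasterDataRepository();
        long id = repo.add("city", Map.of("name", "Pune", "country", "India"));
        repo.update("city", id, Map.of("district", "Pune"));
        System.out.println(repo.search("city", "name", "pun").size()); // prints 1
    }
}
```

A real version would of course sit behind a database and a generic admin screen driven by per-table metadata, but the shape of the API stays the same.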
Hope that makes sense
|
|
|
|
|
Good answer!
Using ASP.NET Dynamic Data
"(I) am amazed to see myself here rather than there ... now rather than then".
― Blaise Pascal
|
|
|
|
|
What you do is based more on what you posted here.
I can say for sure that attempts to "make my work easier" which result in dynamic/metadata solutions are always wrong. They do nothing but make the final solution MUCH harder to maintain.
That said, there are numerous solutions that already exist for creating multiple layers based on actual data models. Those work when the actual concern is not a dynamic solution, but rather the work involved in creating code for many data entities that is basically the same.
Depending on the solution it can create any or all of the following
1. The DDL
2. The DML - stored procs that act as a database API
3. DTOs and DAO in your language of choice
4. DAO API layers.
I have been rolling such solutions myself for decades.
The only suggestion I would add to the above is that you must not let the ease of use of the DTOs tempt you into extending their use into other layers of the application unless they are free from all database hierarchy abstractions. And perhaps not even then.
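As an illustration of the generation approach (item 3 in the list above), here is a minimal sketch, with hypothetical names, of a generator that emits a DTO class from a table description:

```java
import java.util.*;

// Hypothetical sketch of layer generation from a data model: given a table
// name and its columns, emit the source of a matching Java DTO.
public class DtoGenerator {
    public static String generateDto(String table, Map<String, String> columns) {
        String cls = Character.toUpperCase(table.charAt(0)) + table.substring(1) + "Dto";
        StringBuilder src = new StringBuilder("public class " + cls + " {\n");
        for (Map.Entry<String, String> col : columns.entrySet()) {
            src.append("    private ").append(col.getValue())
               .append(' ').append(col.getKey()).append(";\n");
        }
        // getters/setters, equals/hashCode etc. would be appended the same way
        src.append("}\n");
        return src.toString();
    }

    public static void main(String[] args) {
        Map<String, String> cols = new LinkedHashMap<>();
        cols.put("id", "long");
        cols.put("name", "String");
        System.out.println(generateDto("city", cols));
    }
}
```

The point is that the generated code is ordinary static code checked into the build, not a dynamic/metadata runtime - the generator saves typing without making the final solution harder to maintain.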
|
|
|
|
|
Ah, this old chestnut again: every developer sooner or later wants to try to automate this type of functionality or pass it to the user. As JSchell said, it almost always ends in tears.
You do realise that nearly every one of us (the old farts here) has thought about or attempted this type of project, and yet most go back to single-use code for each dimension table.
Moral of the story - don't ask the old farts for ideas; we are not happy with what we have come up with, and it will take a new approach to create something useful in this area.
Never underestimate the power of human stupidity
RAH
|
|
|
|
|
We have designed a 3-tier web app for a finance application.
The business tier is divided further into layers - manager, helper, util - to modularize the code and isolate the different functions, i.e. core business vs non-business code, from each other.
The util layer has non-business functions which are required during a particular process, e.g. DateUtils.java, EncryptionUtil.java, etc.
The helper layer has business logic which is specific to a particular business process and not required in other business processes, e.g. SomeThirdPartyInterestCalculationHelper.java, SpecificRequestBuilder.java.
The manager layer has the business process which controls the flow as well as implementing some parts of the business logic, e.g. CustomerAccountManager.java has different methods for CRUD operations on a customer account. It calls the different helpers, utils, DTOs, etc. and gets the work done. It also implements some pieces of business logic, so it performs a mix of the BPM role and parts of the core business process logic.
As the processes become more complex and longer, my manager layer keeps growing and no longer looks like well organized code.
I want separate layers with specific roles, i.e. business process controller, business process execution (core business logic), CRUD operations which are DB specific, helpers (specific to processes), and non-business logic.
What can be a better design pattern to achieve this?
I am trying out the Business Object pattern to isolate the different parts of the business logic, coupled with the Application Service pattern.
So, for executing a business process, I would have:
1. ApplicationService - Would be a pure business process controller calling different business objects and controlling execution based on results of BO methods
2. BusinessObject1 - Core business logic in different methods - Called by ApplicationService
3. BusinessObject2 - Core business logic in different methods - Called by ApplicationService (if BusinessObject1 grows bigger or BusinessObject 1 and 2 can perform specific business functions)
4. IntegrationBusinessObject - To call other third party services required in business process
5. DomainEntityBusinessObject - CRUD operations for a particular domain entity required in the process; will also have some business-level checks required before or after the CRUD operations
6. Adaptors - To convert formats for third party services - May be called by IntegrationBusinessObject
Idea is to make classes more compact and doing specific business functions. Also, control the process from a single class (Application Service) so that changing the process can be easier.
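A minimal Java sketch of the separation described above (all names are illustrative): the ApplicationService only controls the flow, while the business objects own the rules.

```java
// Hypothetical sketch of the proposed layering. The ApplicationService is a
// pure process controller; the business objects hold the core logic.
interface InterestCalculator {                       // BusinessObject1: core business logic
    double interestFor(double principal);
}
interface AccountStore {                             // DomainEntityBusinessObject: CRUD
    void credit(String accountId, double amount);
}

class InterestPostingService {                       // ApplicationService: flow control only
    private final InterestCalculator calculator;
    private final AccountStore store;

    InterestPostingService(InterestCalculator calculator, AccountStore store) {
        this.calculator = calculator;
        this.store = store;
    }

    // The process: compute, then persist. No business rules live here,
    // so changing the process means changing only this class.
    double postInterest(String accountId, double principal) {
        double interest = calculator.interestFor(principal);
        store.credit(accountId, interest);
        return interest;
    }
}
```

Because the service depends only on interfaces, swapping a business object (say, a third-party interest calculation) leaves the process controller untouched, which is the maintainability property being asked about.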
Is this design maintainable and scalable?
|
|
|
|
|
Quote: Is this design maintainable and scalable?
The "mental model" that you carry around of this particular app / system has no relation to the "user's" mental model of the business.
I can only assume you will never be actually talking to a "user"; since this is all plumbing.
The "application" programmers will love it (not).
Business processes are value "chains"; not "layers".
"(I) am amazed to see myself here rather than there ... now rather than then".
― Blaise Pascal
|
|
|
|
|
I have 3 DNN webservers behind a Citrix Load Balancer, the load balancer is configured for SSL Offloading.
I discovered that the login link doesn't work any more; it just refreshes whenever it is clicked. The URL of the login link is: https://test.abc.net/User-Login?returnurl=%2f. The link, when clicked, is supposed to take users to the page where they will log in.
When I changed the Citrix load balancer to HTTP, everything worked normally, i.e. http://test.abc.net/User-Login?returnurl=%2f takes users to the Login Page.
Any suggestion on how to resolve this issue will be appreciated.
|
|
|
|
|
Have you set the SSL Offloading header in DNN?
Setting the SSL Offload Header Value[^]
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
|
|
|
|
|
Thanks Richard. Your suggestion saved the day.
|
|
|
|
|
In an imagined situation we have a file without an extension. We open the file with a hex editor. Is there any chance of interpreting what we see in the hex editor and ascertaining whether the file contains only instructions or only data (maybe text, maybe some other format)?
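For reference, the usual first step is to look at the file's leading "magic" bytes, which many executable and data formats reserve for a signature. A minimal Java sketch (the signature list is illustrative, not exhaustive):

```java
import java.io.*;
import java.nio.file.*;
import java.util.*;

// Hypothetical sketch: classify a file by its leading "magic" bytes,
// falling back to a crude plain-text heuristic.
public class MagicSniffer {
    public static String sniff(byte[] head) {
        if (head.length >= 2 && head[0] == 0x4D && head[1] == 0x5A) return "PE executable (MZ)";
        if (head.length >= 4 && head[0] == 0x7F && head[1] == 'E' && head[2] == 'L' && head[3] == 'F')
            return "ELF executable";
        if (head.length >= 2 && (head[0] & 0xFF) == 0xFF && (head[1] & 0xFF) == 0xD8) return "JPEG image";
        // No known signature: check whether the bytes look like printable text.
        for (byte b : head) {
            int v = b & 0xFF;
            if (v != 9 && v != 10 && v != 13 && (v < 32 || v > 126)) return "unknown binary data";
        }
        return "likely plain text";
    }

    public static void main(String[] args) throws IOException {
        byte[] head = new byte[16];
        try (InputStream in = Files.newInputStream(Paths.get(args[0]))) {
            int n = in.read(head);
            System.out.println(sniff(Arrays.copyOf(head, Math.max(n, 0))));
        }
    }
}
```

This only tells you the likely container format; distinguishing instructions from data inside an arbitrary binary is much harder and in general undecidable, which is why tools like the Unix `file` command rely on large signature databases plus heuristics.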
|
|
|
|
|
|
|
Hi,
I am trying to install PyQt on a Mac but I cannot make it work. I used brew to install it:
1 - "brew install pyqt".
2 - "brew cask install qt-creator"
Apparently it did everything it had to do, but how do I start Qt?
Many thanks, Pedro
|
|
|
|
|
|
Hi,
I've installed with Brew and navigated to /usr/local/Cellar/pyqt/.../bin, and there are three files there, but none of them starts the Qt graphical environment.
|
|
|
|
|
PyQt is a Python plugin to bind to the Qt framework. It can't be "started". It allows Python applications to use the Qt framework.
You have installed the Qt creator IDE (Integrated Development Environment). So you have to navigate to the installation directory of that and start it.
|
|
|
|
|
Ok will try that.
Brew installed pyqt, qt, python, sip and python@2. I guess then I should start it from qt directory.
I have already installed on my machine the "normal" Qt with C++. Can I use that installation and just add PyQt to it ?
|
|
|
|
|
Quote: I have already installed on my machine the "normal" Qt with C++. Can I use that installation and just add PyQt to it?
What you have probably already installed are the Qt libraries, C++ header files, and maybe also the Qt Creator.
I would guess that PyQt uses your already existing Qt libraries.
An existing Creator should also be usable when configured for PyQt.
|
|
|
|
|
The problem is that brew puts everything in a directory called Cellar and installs everything there, even if you already have the program installed somewhere else.
For example, I already have Python installed on my Mac; nevertheless brew installed it again under Cellar. Quite strange, I think! So, given this, how can I "force" PyQt to use my original Qt installation?
|
|
|
|