|
Depends on what you want to do:
- Want a developer-friendly cloud with good documentation/community support? Go with Azure.
- Want a consumer-friendly cloud with everything ready as-an-API? Go with Google Cloud Platform.
- Want a cloud platform for the Asia-Pacific market? Go with Alibaba Cloud (sometimes I feel like it is Azure, but for Asian markets).
- Want just a hosting platform and a database set? Go with DigitalOcean.
- Want to get stuck reading documentation/creating support tickets/learning that the issue you are facing in 2020 was reported back in 2015 but the platform decided never to solve it and moved on with marketing and sales campaign? Go with AWS.
I hope this helps.
The sh*t I complain about
It's like there ain't a cloud in the sky and it's raining out - Eminem
~! Firewall !~
|
|
|
|
|
|
Sure!
Azure: Developer-friendly with strong support.
Google Cloud Platform: Consumer-friendly with extensive APIs.
Alibaba Cloud: Targeting the Asia-Pacific market.
DigitalOcean: Straightforward hosting and databases.
AWS: Extensive but potentially challenging documentation and support.
|
|
|
|
|
I use Azure because I have been a Microsoft developer forever, so it's easiest for me, and Microsoft is awesome.
Social Media - A platform that makes it easier for the crazies to find each other.
Everyone is born right handed. Only the strongest overcome it.
Fight for left-handed rights and hand equality.
|
|
|
|
|
Everything in the technology world comes as a full package; there will be pros and cons all along.
Whether you should be bothered about the cons is all down to the usage. May I know what kind of business application you are talking about?
|
|
|
|
|
From my experience with both Azure and AWS, I found Azure better than AWS, as Microsoft offers better-architected hybrid solutions and security offerings than the others.
|
|
|
|
|
I am learning more about artificial intelligence these days, so I am working on AWS. It is smooth and efficient.
|
|
|
|
|
It's really hard to tell anything without knowing your requirements. For instance, for my pet project I've chosen Oracle Cloud, as its free tier allows me to have multiple VMs with up to 24 GB RAM in total; since I don't anticipate big loads, having 12 GB VMs with a load balancer for free is quite neat.
In case you're serving a static site, a CDN might be a better option, since your content will be deployed to multiple edges.
Also, the cloud is pretty hot, but its main use cases are the ability to scale dynamically and eliminating the ops. Otherwise you might just consider one big server[^]
|
|
|
|
|
So, I've "played" with Azure a couple of times in the past, and found it MASSIVELY confusing - vast number of different services, huge number of (wildly varied) pricing options, dozens of pages of configuration, huge amount of new terminology. Really not into that, I'm a developer at heart, I like writing code.
I have a client with a critical line-of-business ASP.Net Webforms website (not web application), plus a separate sub-domain for testing. He would like to move this away from his cheap-as-chips shared hosting, and onto Azure. (Two different hosting providers keep messing us around by changing configurations that break the site).
So, is there a SIMPLE migration tool to help me move this site into Azure? A complication is that it uses MySql so we need a MySql service on Azure too (with 2 separate databases, 1 prod 1 UAT). The MS Azure site has reams of stuff about migrating, but it looks like it assumes you control a dedicated server. Doesn't necessarily need to be a tool, a comprehensive tutorial / walkthrough would be fine, BUT it would need to be up-to-date (When tinkering with Azure a couple of years ago I found not only was it complex, it was constantly shifting too!).
Also anyone else's experiences with this migration path would be welcome, as I will need to give the client an estimate of my time on this migration. And once done, I will still need to connect from my existing d/b client (e.g. Heidi SQL, MySql Workbench) to the databases, and be able to EASILY upload changed pages, access IIS logs etc. I did at one point get the app running on Azure (linking back to the existing hosted MySql d/b) so I know it will run with minimal changes, just finding the whole thing pretty daunting right now.
|
|
|
|
|
Quote: critical line-of-business ASP.Net Webforms website (not web application)
Web Forms is a web application, and should be just fine on Microsoft Azure.
ASP.NET Web Deployment using Visual Studio: Introduction | Microsoft Docs
Quote: plus a separate sub-domain for testing.
Microsoft Azure provides this built in to the platform: Deployment Slots.
Quote: is there a SIMPLE migration tool to help me move this site into Azure?
It depends; if you can connect to the MySQL database locally and create a backup, Microsoft might have some tools to deploy the MySQL snapshot to the Azure platform.
Quote: A complication is that it uses MySql so we need a MySql service on Azure too (with 2 separate databases, 1 prod 1 UAT).
I recommend using Terraform to create the infrastructure for dev and UAT. The UAT environment can be created when a test is needed and then disposed of when it is not.
Quote: (When tinkering with Azure a couple of years ago I found not only was it complex, it was constantly shifting too!).
Sadly, it still is. But if you can get a subscription for the MySQL engine on Azure, running the backup/restore should be just fine.
|
|
|
|
|
Many thanks for your response.
Afzaal Ahmad Zeeshan wrote: Web Forms is a web application, and should be just fine on Microsoft Azure.
Nope. ASP.Net Webforms allows you to create two types of web site: a "Web Application", which involves pre-compiling code into DLLs (all done by Visual Studio), where you deploy the DLLs and .ASPX pages; and a "Web Site", where the runtime compiles each code-behind page when it is first called. It's therefore critical that Azure includes those runtime components, or my site would not run as-is.
Afzaal Ahmad Zeeshan wrote: Microsoft Azure provides this built-in to the platform; Deployment Slots.
But only in the higher-end packages. Because the entire slot is "swapped", there's no way during the swap to also switch from the UAT to the production database if using a standard connection string. Also, the IIS InstanceID presumably changes during the swap, so there is no way for an end-user to determine which environment they're looking at on-screen. (I use the InstanceID to make CSS changes, including adding the environment name, to reduce the risk of making "test" transactions in the production environment; see IIS_Environment_Detection[^].)
A quick scout around the "Deployment slots" documentation just now confirms that Azure remains obscure and overly complex for "normal" website applications.
|
|
|
|
|
Quote: Nope. ASP.Net webforms allows you to create two types of web site - a "Web Application", which involves pre-compiling code into DLLs (all done by VisualStudio) and you implement the DLLs and .ASPX pages; and "Web Site", where the runtime does the compilation of each code-behind page when first called. It's therefore critical that Azure includes those runtime components or my site would not run as-is.
Right, thanks for the information. From what I "think", Visual Studio should be able to handle this situation.
|
|
|
|
|
We currently have an intranet portal which uses IMAP to read email from a shared mailbox on Office365, using the excellent MailKit[^] library.
I've just been made aware that Microsoft is going to disable basic authentication for IMAP clients. Originally, this was going to happen in October, but it's since been pushed back to the second half of 2021.
Basic Auth and Exchange Online – February 2020 Update - Microsoft Tech Community - 1191282[^]
After some panicked reading and experimentation (and much swearing), I've managed to get the MSAL[^] library to return an OAuth2 access token:
const string ClientId = "...";
const string Username = "...";
SecureString password = ...;
var scopes = new[] { "https://outlook.office365.com/.default" };
var app = PublicClientApplicationBuilder.Create(ClientId).WithAuthority(AadAuthorityAudience.AzureAdMultipleOrgs).Build();
var tokenResult = await app.AcquireTokenByUsernamePassword(scopes, Username, password).ExecuteAsync(cancellationToken);
After configuring the application as a "public client" in the Azure portal, giving it Mail.ReadWriteAll and Mail.SendAll permissions, and granting admin consent for my organisation, this code now returns a seemingly-valid access token.
According to the author of the MailKit library[^], all I need to do now is use the token to authenticate:
using var client = new ImapClient();
await client.ConnectAsync("outlook.office365.com", 993, SecureSocketOptions.Auto, cancellationToken);
await client.AuthenticateAsync(new SaslMechanismOAuth2(Username, tokenResult.AccessToken), cancellationToken);
Unfortunately, that simply throws an "authentication failed" exception.
MailKit.Security.AuthenticationException
HResult=0x80131500
Message=Authentication failed.
Source=MailKit
StackTrace:
at MailKit.Net.Imap.ImapClient.<AuthenticateAsync>d__81.MoveNext()
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.ConfiguredTaskAwaitable.ConfiguredTaskAwaiter.GetResult()
at TestOffice365OAuth.Program.<MainAsync>d__1.MoveNext()
at System.Runtime.ExceptionServices.ExceptionDispatchInfo.Throw()
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.TaskAwaiter`1.GetResult()
at TestOffice365OAuth.Program.Main()
I don't know whether this is a bug in Microsoft's implementation, a bug in MailKit, a configuration error with my application, or a mistake in my code.
Has anyone managed to get Office365 IMAP access working with OAuth2? Can you spot anything I've missed?
"These people looked deep within my soul and assigned me a number based on the order in which I joined."
- Homer
modified 28-Apr-20 6:21am.
|
|
|
|
|
Thanks to "paulflo150" on GitHub, I was able to authenticate with the token by changing the scopes to:
var scopes = new[] { "https://outlook.office365.com/IMAP.AccessAsUser.All" };
Now I need to find out how to connect to a shared mailbox. The usual trick of appending "\shared-mailbox-alias" to the username results in the same "authentication failed" error, and if I authenticate without it there are no "shared namespaces" available.
|
|
|
|
|
Finally managed to get the code to access a shared mailbox, with help from Sivaprakash Saripalli at Microsoft. It's simply a case of passing the email address of the mailbox instead of the username to the SaslMechanismOAuth2 constructor.
const string ClientId = "...";
const string UserName = "...";
SecureString Password = ...;
const string Mailbox = "...";
var scopes = new[] { "https://outlook.office365.com/IMAP.AccessAsUser.All" };
var app = PublicClientApplicationBuilder.Create(ClientId).WithAuthority(AadAuthorityAudience.AzureAdMultipleOrgs).Build();
var authenticationResult = await app.AcquireTokenByUsernamePassword(scopes, UserName, Password).ExecuteAsync(cancellationToken);
using var client = new ImapClient();
await client.ConnectAsync("outlook.office365.com", 993, SecureSocketOptions.Auto, cancellationToken);
await client.AuthenticateAsync(new SaslMechanismOAuth2(Mailbox, authenticationResult.AccessToken), cancellationToken);
|
|
|
|
|
I don't know if you have a problem with English or if you are trying to increase your reputation by randomly posting irrelevant crap; either way please stop or it will be treated as site abuse and the BanHammer will descend upon you.
If it's a problem with English, write your message in your native language then use Google Translate to get an English version.
If it's a reputation thing, then stop immediately.
"I have no idea what I did, but I'm taking full credit for it." - ThisOldTony
"Common sense is so rare these days, it should be classified as a super power" - Random T-shirt
AntiTwitter: @DalekDave is now a follower!
|
|
|
|
|
The BanHammer is already swinging[^].
|
|
|
|
|
Hi Sir,
I need help; I am a noob in ASP.NET, using VB for my coding.
Situation:
a) Using a master page
b) Got 3 web forms (Staff.aspx, Student.aspx, External.aspx)
c) Every form has a search textbox
In Staff.aspx, the search textbox works great when connecting to the Oracle database, and the cardholder data appears in the other textboxes.
But when using Student.aspx or External.aspx, the search textbox shows "No Data / Cardholder", even though the data exists in the database.
The SQL query is OK when I directly include the user ID, but when it comes from the textbox the data does not appear.
Please guide me.
Thank you
|
|
|
|
|
Why have you posted this same question a second time? Please post questions once only.
|
|
|
|
|
I have run into the following problem. I have a pipeline that reads data from an MS SQL Server and stores it in a file in a BLOB container in Azure Storage. The file is in Parquet (or Apache Parquet, as it is also called) format.
When the "sink" (output) file is stored compressed (snappy or gzip, it does not matter) AND the file is large enough (more than 50 MB), the pipeline fails with the following message:
{
"errorCode": "2200",
"message": "Failure happened on 'Sink' side. ErrorCode=UserErrorJavaInvocationException,'Type=Microsoft.DataTransfer.Common.Shared.HybridDeliveryException,Message=An error occurred when invoking java, message: java.lang.OutOfMemoryError:Java heap space\ntotal entry:11\r\njava.util.ArrayDeque.doubleCapacity(Unknown Source)\r\njava.util.ArrayDeque.addFirst(Unknown Source)\r\njava.util.ArrayDeque.push(Unknown Source)\r\norg.apache.parquet.io.ValidatingRecordConsumer.endField(ValidatingRecordConsumer.java:108)\r\norg.apache.parquet.example.data.GroupWriter.writeGroup(GroupWriter.java:58)\r\norg.apache.parquet.example.data.GroupWriter.write(GroupWriter.java:37)\r\norg.apache.parquet.hadoop.example.GroupWriteSupport.write(GroupWriteSupport.java:87)\r\norg.apache.parquet.hadoop.example.GroupWriteSupport.write(GroupWriteSupport.java:37)\r\norg.apache.parquet.hadoop.InternalParquetRecordWriter.write(InternalParquetRecordWriter.java:123)\r\norg.apache.parquet.hadoop.ParquetWriter.write(ParquetWriter.java:292)\r\ncom.microsoft.datatransfer.bridge.parquet.ParquetBatchWriter.addRows(ParquetBatchWriter.java:60)\r\n,Source=Microsoft.DataTransfer.Common,''Type=Microsoft.DataTransfer.Richfile.JniExt.JavaBridgeException,Message=,Source=Microsoft.DataTransfer.Richfile.HiveOrcBridge,'",
"failureType": "UserError",
"target": "Work_Work"
}
"Work_Work" is the name of a Copy Data activity in the pipeline.
If I turn the compression off (the generated BLOB file is uncompressed), the error does not happen.
Is this the error described in https://docs.microsoft.com/en-us/azure/data-factory/format-parquet: the “…If you copy data to/from Parquet format using Self-hosted Integration Runtime and hit error saying "An error occurred when invoking java, message: java.lang.OutOfMemoryError:Java heap space", you can add an environment variable _JAVA_OPTIONS in the machine that hosts the Self-hosted IR to adjust the min/max heap size for JVM to empower such copy, then rerun the pipeline….”?
If it is, have I understood correctly that I have to do the following:
Go to the server where the "Self-hosted Integration Runtime" runs (I still have no idea what it is) and increase the max heap size for the JVM. Is this correct?
If it is, my next question is: how large should the max heap size be? My pipeline can generate a file whose size will be 30 GB.
What "max heap size" can guarantee that such a file will not cause the failure?
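If this is indeed the documented Self-hosted IR issue, the fix from the linked page boils down to setting the _JAVA_OPTIONS environment variable on the machine that hosts the IR. A minimal sketch for a Windows host (the heap sizes are illustrative starting points to tune, not documented guarantees, and the service name is an assumption; verify it in services.msc):

```shell
REM Set JVM heap bounds machine-wide on the Self-hosted Integration Runtime
REM host; the -Xms/-Xmx values here are illustrative starting points.
setx _JAVA_OPTIONS "-Xms256m -Xmx16g" /M
REM Restart the Integration Runtime service so it picks up the new variable
REM (service name assumed; check services.msc on your machine).
net stop DIAHostService
net start DIAHostService
```

As for sizing: the heap generally needs to cover the Parquet writer's row-group buffering rather than the whole output file, so it is reasonable to start with a few GB and increase only if the OutOfMemoryError recurs.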
|
|
|
|
|
Hello everyone,
I have a question. I want to make a website scanner that allows me to scan a whole website for one word or one color. This is because there isn't one available (at least I couldn't find one).
I am not a programming wizard and I really hope you can help me. I would really appreciate it.
Thanks in advance!
Regards,
Lars
|
|
|
|
|
You need a web scraping solution, something like this in the .NET world: Html Agility Pack | Html Agility Pack.
You can find packages for other languages and runtimes too. The basic concept is that you download the entire web page as an HTML document and the tool parses it. You then read the document and look for your "word or color". That is the closest thing I can think of.
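Html Agility Pack is a .NET library, but the concept is language-neutral. Here is a minimal sketch using only Python's standard-library HTML parser (the PageScanner and scan_page names are my own, and a real scanner would also need to fetch each page and follow links across the site):

```python
from html.parser import HTMLParser

class PageScanner(HTMLParser):
    """Collects visible text and inline style attributes from an HTML page."""
    def __init__(self):
        super().__init__()
        self.text_chunks = []
        self.styles = []

    def handle_data(self, data):
        # Text between tags is where a "word" would appear.
        self.text_chunks.append(data)

    def handle_starttag(self, tag, attrs):
        # Inline style attributes are one place a "color" would appear.
        for name, value in attrs:
            if name == "style" and value:
                self.styles.append(value)

def scan_page(html, word=None, color=None):
    """Return True if the word occurs in the page text, or the color in any inline style."""
    scanner = PageScanner()
    scanner.feed(html)
    found_word = word is not None and any(word.lower() in t.lower() for t in scanner.text_chunks)
    found_color = color is not None and any(color.lower() in s.lower() for s in scanner.styles)
    return found_word or found_color

page = '<p style="color:#ff0000">Hello cloud world</p>'
print(scan_page(page, word="cloud"))     # True
print(scan_page(page, color="#ff0000"))  # True
```

To cover a whole site you would repeat this for every page, discovering URLs from the anchor tags the parser encounters.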
|
|
|
|
|
To find a specific word or phrase on a web page on your computer:
Open a webpage in Chrome.
At the top right, click More, and then Find.
Type your search term in the bar that appears in the top right.
Press Enter to search the page.
Matches appear highlighted in yellow. You can see where all the matches are located on a webpage using the yellow markers on the scrollbar.
I hope this helps!
Ben Martin
|
|
|
|
|
Visit www.google.com in your browser.
Enter site: followed by the site address you want to search.
Enter the word or phrase you want to search for after the site address.
Regards,
Peter
|
|
|
|
|
Part rant, part some details I have found so far.
I am trying to figure out what the basic cost is to run various Azure app setups.
Like a simple todo web application - connected to a very simple data store.
But when most searches just end with using the Pricing calculator, for me it is an overload of information.
Very simple: the minimum use of Cosmos DB is $23.61 per month. But then what other components do you need to add?
So if anyone knows of some resources providing to developers the minimum costing to run some small applications, that would be great?
now some info:
(most numbers based on West Europe/UK zone selection, which in most cases is the same as US pricing)
Prices are Per Month
Azure Functions: $0.00 - it takes almost 1 million 2-second calls before $1 would be hit.
Cosmos DB: minimum 4 x 100 RU/sec + 1 GB = $23.61
Storage Account (Block Blob Storage): 1GB with 1000 on all operations: £1.25
Storage Account (Block Blob Storage): 1GB with 10,000 operations: $0.11 (is this right?)
Storage Account (Table Storage): 1GB with 100,000 transactions: $0.09
Storage Account (Data Lake gen2): 1GB with 10,000 on each operation type: $0.74
Azure SQL Database: this is over $1,699.22 per month at a minimum???
App Service: $68.62
So apart from Cosmos DB, for entry-level messing about with Azure storage AFTER the FREE account has expired, some of these seem sensible.
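For back-of-the-envelope checks like the Functions number above, it helps to encode the consumption-plan formula. A hedged sketch in Python, assuming the published pay-as-you-go rates and free grants at the time ($0.20 per million executions, $0.000016 per GB-second, with 1 million executions and 400,000 GB-seconds free per month); verify against the current pricing page:

```python
# Assumed Azure Functions consumption-plan rates (check the pricing page):
EXEC_RATE = 0.20 / 1_000_000   # $ per execution
GBS_RATE = 0.000016            # $ per GB-second
FREE_EXECS = 1_000_000         # free executions per month
FREE_GBS = 400_000             # free GB-seconds per month

def functions_monthly_cost(calls, seconds_per_call, memory_gb):
    """Estimated monthly cost after subtracting the free grants."""
    gb_seconds = calls * seconds_per_call * memory_gb
    billable_execs = max(0, calls - FREE_EXECS)
    billable_gbs = max(0.0, gb_seconds - FREE_GBS)
    return billable_execs * EXEC_RATE + billable_gbs * GBS_RATE

# ~1 million 2-second calls at the minimum 128 MB stay inside the free grant:
print(functions_monthly_cost(1_000_000, 2, 0.128))  # 0.0
```

At the minimum 128 MB memory size, a million 2-second calls consume about 256,000 GB-seconds, inside the 400,000 free grant, which is consistent with the $0.00 line above; at larger memory sizes the bill arrives much sooner.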
|
|
|
|
|