In this tutorial, we will discuss how to publish a .NET Core solution to a VSTS (Visual Studio Team Services) online Git repository using the Git command line.
1. Copy the clone URL for your Git repository. In case you are new to VSTS, follow the steps in this link to create a repository in Visual Studio Team Services (VSTS) online.
2. This tutorial assumes you already have an existing .NET project or solution that needs to be pushed. In case you don't have one, create a project in the free Visual Studio Community IDE (File -> New Project -> Web -> (choose .NET Core) Web API & select Create New Solution).
3. Git for Windows has been installed on your local machine.
4. Open a Windows Command Prompt (Start -> Run -> type 'cmd'), Windows PowerShell, or the terminal inside VS Code, at the project folder/solution path.
Let's see how to publish a local solution to VSTS online using a few simple Git commands.
1. Initialize Git by running the command below:
git init .
The above command initializes an empty Git repository in the current directory inside the project folder.
2. Connect to the remote online repository using the command below:
git remote add origin <clone URL copied in step 1>
The above command establishes the connection between the local repository and the VSTS online Git repository (the remote is conventionally named origin).
3. Add all your local file changes to the 'stage' using the command below:
git add --all
The above command stages all the files that will be included in the push to the online repository.
4. Commit the staged changes by running the command below:
git commit -m "Initial Push to Online Repo from Local"
5. Finally, push your commit to the online repository:
git push -u origin master
(Use git push origin <branch> to push a specific branch or feature branch.)
If it succeeds, you will see a status report as shown below, with details such as the file count, compressed objects, and the number of threads used.
During the push, the local repository content is converted into Git objects, which are packaged into a pack file with an index. Each new commit updates the history with its message, and these changes are best viewed in the tree view of commits available in VSTS online -> Code (History view) -> sort by tree view.
The above solution helps you publish code through the command-line interface, which is useful when you need to publish code to an online repository from your local desktop without the Visual Studio IDE.
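The whole flow can be sketched as a single script. This is only a sketch, not the exact VSTS setup: a local bare repository stands in for the online remote (so the commands can run end to end without network access or credentials), and the file name and identity values are illustrative.

```shell
# A minimal end-to-end sketch of the steps above. The VSTS clone URL is
# a placeholder; a local bare repository stands in for the online remote.
WORK=$(mktemp -d)
git init --bare "$WORK/remote.git"              # stand-in for the VSTS repo

mkdir "$WORK/MySolution" && cd "$WORK/MySolution"
git init .                                      # 1. initialize the local repo
git config user.email "dev@example.com"         # local identity for the commit (illustrative)
git config user.name "Dev"
git remote add origin "$WORK/remote.git"        # 2. connect local repo to the remote
echo 'Console.WriteLine("hi");' > Program.cs    # stand-in for your solution files
git add --all                                   # 3. stage all changes
git commit -m "Initial Push to Online Repo from Local"   # 4. commit
git branch -M master                            # name the branch explicitly
git push -u origin master                       # 5. push and set the upstream branch
```

In a real run, you would replace the local bare repository with the clone URL copied from VSTS and run the commands from your existing solution folder.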
In this tutorial, we will discuss the features & advantages of Azure data storage options over the classic, cost-heavy on-premises storage model.
So, what will we cover in the next 3 minutes?
We will explore the different Azure data storage options,
and compare Azure data storage with on-premises data storage.
As Microsoft has said, their next 5 years will be heavily focused on the Internet of Things & Artificial Intelligence. To support this, they have built a strong platform for storing data in the cloud, as discussed earlier in the Core Services Platform post covering PaaS, IaaS & SaaS.
Why Should we Choose Azure Data Storage?
Automated backup and recovery: no need for 24×7 support from an admin team.
Replication across the globe: Microsoft data centers are located in regions across the globe. Hence, data will not be lost if we follow the security model, storage model & governance model proposed by Microsoft.
Support for data analytics: Microsoft Azure Data Lake Storage Gen2, App Insights will take care of your Application Data & Site Analytics in-depth.
Encryption capabilities: we might wonder, if we use the public cloud operated by Microsoft, do I need to worry about my data leaking or being shared? Well, Microsoft has clearly demonstrated its capabilities through container virtualization, VM virtualization & cryptographic key policies for multi-tenant workspaces.
Multiple data types: Blobs, Files & Queues help you save different types of file formats and their server rendition models.
Data storage in virtual disks: Azure can store up to 8 TB of data in a virtual disk. A good option to be considered for large blob images, videos, multimedia formats & their search features.
Storage tiers: automatic assignment of VMs or storage containers helps our clients sleep peacefully (provided a budget limit is set in the Azure portal using the Azure Cost Management preview feature).
What types of data does Azure support?
Microsoft arrived at the below data types based on storage design requirements. Let's see a few examples:
Structured – data with a proper schema. Ex: Azure SQL Database, a database as a service (DaaS)
Semi-structured – data that can be identified by unique keys or tags and can contain any number of columns. Ex: Azure Cosmos DB, a NoSQL database
Unstructured – data with no defined schema. Ex: Azure Blob Storage
1. Azure SQL Database
Azure SQL Database is a relational database as a service (DaaS)
SQL Database is a high-performance, reliable, fully managed and secure database.
Migrate existing SQL Server databases using the Azure Database Migration Service (backed by the Microsoft Data Migration Assistant)
2. Azure Cosmos DB
Azure Cosmos DB is a globally distributed database service built for a PaaS strategy.
Supports schema-less databases – you can update the metadata at any point in time. Ex: I created an Azure Cosmos DB account & collection and added a first row (item) with 2 metadata fields (Title, ID). In a second row, I can push new metadata fields (Image URL, Link URL, Description) in addition to the existing fields. It is one of the super cool features for Azure developers, and it supports SQL queries as well.
Global availability makes it a top-tier database choice for architects.
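The schema-less example above can be pictured as two items living side by side in the same collection (the field names and values here are illustrative, not from a real database):

```json
[
  { "id": "1", "Title": "First item" },
  {
    "id": "2",
    "Title": "Second item",
    "ImageUrl": "https://example.com/img.png",
    "LinkUrl": "https://example.com",
    "Description": "New fields added without any schema change"
  }
]
```

The second item simply carries extra fields; no schema migration is needed, and queries can still filter on any field that exists.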
3. Azure Blob Storage
Azure Blob Storage holds unstructured data.
Blob Storage can manage thousands of simultaneous uploads, massive amounts of video data, constantly growing log files, and can be reached from anywhere with an internet connection.
Blobs aren’t limited to common file formats. A blob could contain gigabytes of binary data streamed from a scientific instrument, or data in a custom format for an app you’re developing.
Ability to store up to 8 TB of data for virtual machines
4. Azure Data Lake Storage Gen2
Contains both structured and unstructured data.
Mainly used for analytics purposes.
Has big data file system capabilities.
5. Azure Files
Azure Files offers fully managed file shares in the cloud that are accessible via the industry standard Server Message Block (SMB) protocol.
Azure file shares can be mounted concurrently by cloud or on-premises deployments of Windows, Linux, and macOS.
Azure Files encrypts data at rest, and SMB 3.0 encryption can protect the data in transit.
6. Azure Queues
Azure Queue storage is a service for storing large numbers of messages that can be accessed from anywhere in the world.
A single queue message is up to 64 KB in size, and a queue can contain millions of messages.
Consists of sender & receiver components for sharing messages. A sender can be a Web API, web app, or mobile app, and a receiver can be any service application capable of listening for or accepting messages.
Mainly used for distributing work between servers to balance the load.
Comparison between Azure data storage & on-premises storage options:
Here are Azure's capabilities over on-premises servers, in simple terms, based on our experience. Add your pros & cons in the comments section.
A nearly limitless pool of raw compute, storage, and networking components that expands dynamically based on customer utilization. Whereas on-premises, each additional server, database or load balancer needs an upfront infrastructure fee and depends on admin support.
Speech recognition and other cognitive services that help make your application stand out from the crowd. Whereas on-premises, you need to identify third-party tools for each set of enterprise applications.
Analytics services that enable you to make sense of telemetry data coming back from your software and devices. Whereas on-premises, analytics coverage is far from complete, since it requires additional calls or depends on a browser agent; third-party tools like AppDynamics and Omniture still lag well behind Microsoft's telemetry services.
Azure-stored data can be mounted concurrently by cloud or on-premises deployments of Windows, Linux, and macOS. Whereas on-premises, servers need to be identified for different software and categorized as WFE, application, and database servers, with a dedicated team and disaster-recovery/backup options to be considered.
Azure Search supports data crawling, artificial intelligence, cognitive skills & telemetry analysis. Whereas on-premises, you need to configure several application servers, e.g. SharePoint search database content, crawled database content, indexed database content, plus business intelligence license procurement.
The automation deployment model supports hybrid deployment to both Azure and on-premises using Azure DevOps: continuous integration through multi-platform gates such as PowerShell, Node.js and Jenkins (1000+ marketplace tools), continuous builds, and the container image deployment model. Whereas on-premises, automation is still a nightmare requiring huge customization & procurement.
The Azure governance model is well protected by built-in security policies, compliance policy centers, encryption policies, and replication & disaster recovery policies. Whereas on-premises, data classification, policy identification & data breach identification require support from legal agencies and enterprise consultants, plus a continuous data watch.
Azure keeps the operating cost minimal, with the running cost depending on utilization. Disk storage cost prediction (with the tier model – hot, cool, archive) seems to be about 98% accurate, so clients can prepare their budget well in advance based on the proposed and adopted architecture. Whereas on-premises, performance, network calls & operational threshold limits can affect both operating and running costs, and each additional resource requires procurement, configuration, admin support & budget allocation in the middle of nowhere :)
We can summarize above points in terms of:
Cost effectiveness (Ex – pay-as-you-go subscription Model),
Reliability (Ex- Data backup, load balancing, and disaster recovery strategies),
To create a UDCX file for the main data connection, follow these steps:
Open your InfoPath form and click on “Manage Data Connections…” under Fields in the InfoPath form. Then, in the new pop-up window, select your main connection and click on “Convert to Connection File”:
Then give your data connection library’s path (on the SharePoint site) in the textbox, as:
for example: http://sharepoint_site/DataConnectionLibrary/ABC_Main_Connection.UDCX
and click on OK.
After this, you can see that your main connection’s UDCX file is available in your site’s Data Connection Library.
Make sure you have a logo, screenshots, some descriptive text, and a version of your .app file compiled for Release ready for the app submission.
Decide how you are going to licence your app. The app store itself allows you to define how the app will be licensed: will it be free, per purchase, or per user, and will there be a trial, etc.? Some of these decisions are not simple and require significant forethought, and in some cases additional development work. For our app I decided to keep it simple and go for a free version. Microsoft published a couple of great blogs / articles helping with the licencing over at the Office apps blog.
Finally, make sure you have tested, tested and tested your app again; the submission process is very thorough and tests the functionality of your app across not only IE but all supported SharePoint 2013 browsers.
Once all of the above is ready, submitting your app is relatively simple. Navigate to the Seller Dashboard and follow the prompts to submit the app.
First choose a listing type. Our app is for Project Server, so we need to choose an app for SharePoint; then click Next.
In the next screen you will be asked for some information about your app, like the name, version, category to list it under, and some other bits and pieces. The most important part is the testing notes; these are your only real way of passing information to the testers who are looking at your app.
As we are making the app available to everyone, there is no need to choose Trial support. Click on Next. The final bits to add before you can submit the app are screenshots and some descriptive text and links to support, EULA and Privacy policies.
Once you’ve added that text, click Next and you’re ready to submit for validation. From experience, the validation process can take around 3-5 working days. Unfortunately, at the moment there is no progress indicator of where you are in the process, with the app being either in a Draft or an Approved state. Once the app has been approved, it takes a few hours to propagate down into the SharePoint app store and become available for everyone to download and start using.