3 Multicloud Tips for Cloud Architects



Have an open mind about all potential solutions. 

Never push back on a solution before you understand the problem. Multicloud is a solution pattern. If you announce that multicloud is a bad solution, that statement comes off as a bit disingenuous if you don’t understand the specific problem you need to solve.

Like all other IT architectures, multicloud is not the answer to all problems. However, it is a clear fit for some. It’s important to keep an open mind; otherwise, you’ll just provide opinions, not solutions. If you don’t consider all the possibilities, you’ll likely end up with an underoptimized architecture that becomes a value drain for the business for years to come.

Implement federated security architectures and evaluate other cross-cloud solutions. 

For most architects, it’s simple to figure out a single cloud deployment’s security: Pick the security layers that the primary cloud provider recommends. This usually includes identity management, directories, encryption, and even industry-specific security support, such as for healthcare or finance. These solutions work, for the most part: they are well supported, and although they are sometimes not as cost-efficient as I would like, they hold their own.

Multicloud is a different story. If you apply the same single-cloud security approach to multicloud, the number of moving parts quickly creates excessive complexity, which becomes a security issue unto itself.

A better method is to use cross-cloud security services. They provide the same security service layers, implemented using whatever native security services are needed for each specific cloud provider. This gives you a single stack of technology that has a single approach and interface for security operations (secops), including the ability to launch a unified defense in case of attacks.
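The "single interface, per-cloud native services" idea can be sketched in a few lines. This is a hypothetical illustration only: the provider and service names below are placeholders for whatever native services each cloud actually uses, not real APIs.

```python
from dataclasses import dataclass

# Hypothetical mapping of each cloud's native security services.
NATIVE_SERVICES = {
    "aws":   {"identity": "IAM",            "encryption": "KMS"},
    "azure": {"identity": "Entra ID",       "encryption": "Key Vault"},
    "gcp":   {"identity": "Cloud Identity", "encryption": "Cloud KMS"},
}

@dataclass
class SecOpsAction:
    cloud: str
    operation: str
    native_service: str

class CrossCloudSecOps:
    """One secops interface; per-cloud dispatch happens underneath."""

    def __init__(self, clouds):
        self.clouds = clouds

    def revoke_credentials(self, user):
        # A single call fans out to every cloud's native identity service,
        # which is what enables a unified defense during an attack.
        return [SecOpsAction(c, f"revoke:{user}", NATIVE_SERVICES[c]["identity"])
                for c in self.clouds]

secops = CrossCloudSecOps(["aws", "azure", "gcp"])
for action in secops.revoke_credentials("jdoe"):
    print(action.cloud, "->", action.native_service)
```

The point of the sketch is the shape, not the names: secops staff learn one interface, and the translation to each provider's native service lives in one place.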

Understand and deal with cloud finops. 

A cloud finops solution for multicloud is critical to success if you have more than one or two clouds to track, each with different terms, pricing, and service-level agreements to manage. Even if you get cloud cost tracking down pat, finops also includes cloud spending observability and, most important, cloud spending optimization. This means you can identify ways the multicloud solution could support all applications and data storage systems more cost-effectively. Simple examples include using reserved instances to earn discounts by purchasing capacity before it’s needed, or proactively removing instances that are no longer needed.
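Both optimization examples boil down to simple arithmetic over usage data. A minimal sketch, with made-up example rates rather than real cloud prices:

```python
# Illustrative finops arithmetic; the rates below are invented examples,
# not actual cloud provider pricing.
ON_DEMAND_HOURLY = 0.40      # assumed on-demand price per instance-hour
RESERVED_HOURLY  = 0.25      # assumed effective reserved price per instance-hour
HOURS_PER_MONTH  = 730

def reserved_savings(instances: int) -> float:
    """Monthly savings from covering steady-state instances with reservations."""
    return instances * HOURS_PER_MONTH * (ON_DEMAND_HOURLY - RESERVED_HOURLY)

def idle_candidates(instances: dict, cpu_threshold: float = 5.0) -> list:
    """Flag instances whose average CPU sits below a threshold for removal review."""
    return [name for name, avg_cpu in instances.items() if avg_cpu < cpu_threshold]

print(f"Reserving 10 steady instances saves ~${reserved_savings(10):,.2f}/month")
print("Idle candidates:", idle_candidates({"web-1": 42.0, "batch-7": 1.2, "dev-3": 0.4}))
```

Real finops tooling does this continuously across providers, but the underlying optimization questions are exactly these two: what steady usage can be pre-purchased at a discount, and what running capacity is no longer earning its keep.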


You can read more about Multicloud Tips here.

Teknita has the expert resources to support all your technology initiatives.
We are always happy to hear from you.

Click here to connect with our experts!

Top 8 Benefits of Cloud Document Management Systems



1. Cloud Document Management Systems Enable Savings on Equipment Costs

The obvious benefit of cloud document management systems is the same as all other digital technologies – they take away the need for “physical” solutions.

Both traditional and electronic in-house document management systems also incur regular maintenance costs, which are much higher than those of cloud-based systems. Using a cloud document management system eliminates not only the initial equipment acquisition cost but also the ongoing cost of maintaining that equipment.

2. They Take Away Spatial Constraints

Space is often a constraint for start-ups and small to medium-sized businesses (SMBs). Funding is limited for these businesses, and commercial property is costly whether leased or bought, so cutting space requirements can go a long way toward increasing savings. Traditional and in-house document management systems tend to take up a lot of space. That space not only draws on the company’s funds but also results in cramped workspaces, which can hurt productivity.

3. They Offer Quicker Deployment

Most cloud document management systems are end-to-end turnkey offerings. They only require the user to register the company, make the payment, and create the relevant user profiles. Most of these services are delivered as web apps, so users don’t even need to install desktop software.

4. They Offer Better Security

Cloud document management systems improve document security against both deliberate and natural threats. Risks posed by external actors are countered with strong encryption algorithms and firewalls. Similarly, risks arising from natural disasters are managed with redundancy protocols: data is backed up regularly so it is never lost for good.
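The redundancy idea is simple to demonstrate in miniature: keep a second copy plus a checksum, detect damage, restore. This is only a simplified illustration of the principle; real cloud providers implement redundancy at the platform level, not like this.

```python
import hashlib
import shutil
import tempfile
from pathlib import Path

def sha256(path: Path) -> str:
    """Checksum used to detect a damaged or tampered file."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

workdir = Path(tempfile.mkdtemp())
primary = workdir / "contract.txt"
backup  = workdir / "contract.bak"

primary.write_text("Signed agreement, v3")
shutil.copy(primary, backup)              # redundant copy
recorded = sha256(primary)                # checksum at backup time

primary.write_text("corrupted!!")         # simulate damage or loss
if sha256(primary) != recorded:           # detect the mismatch...
    shutil.copy(backup, primary)          # ...and restore from the copy

print(primary.read_text())
```

A cloud DMS runs this loop for you, typically across geographically separated data centers, which is why a local disaster doesn't take the documents with it.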

5. They Provide Easy Scalability

Scalability can be a major concern for both start-ups and SMBs. The objective of a business is to grow but if growth comes with added costs that are greater than the growing profits, then further growth can stutter. There is a direct correlation to document management here. With cloud document management systems, the current package can be upgraded for a small increase in the rates. This is why cloud-based systems are known to offer better scalability than traditional or electronic in-house systems.
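"Upgrade the package instead of buying hardware" is really just a tier lookup. A tiny sketch with invented tier names, limits, and prices (any real vendor's plans will differ):

```python
# Hypothetical subscription tiers; names, user limits, and prices are
# invented purely to illustrate pay-as-you-grow scaling.
TIERS = [
    {"name": "Starter",  "max_users": 10,  "monthly_usd": 15},
    {"name": "Team",     "max_users": 50,  "monthly_usd": 60},
    {"name": "Business", "max_users": 250, "monthly_usd": 200},
]

def tier_for(users: int) -> dict:
    """Return the smallest tier that covers the current headcount."""
    for tier in TIERS:
        if users <= tier["max_users"]:
            return tier
    raise ValueError("Headcount exceeds the largest tier; contact the vendor.")

print(tier_for(8)["name"])    # a startup fits the cheapest plan
print(tier_for(45)["name"])   # growth means a plan change, not new hardware
```

The contrast with in-house systems is the cost curve: growth here is a predictable subscription step, not a capital purchase plus installation time.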

6. They Improve Productivity

This is possibly the biggest benefit of cloud document management systems. They not only help the business save money; they also improve productivity by saving time. This time is saved in several ways.

First, they help save time by improving accessibility. They allow documents to be accessed from any location and at any time. More importantly, they increase the speed of collaboration between employees by making workflows more efficient and result oriented. This means that the employees’ time is better utilized and projects are turned around faster.

7. They are Environment-Friendly

Cloud document management systems are also more environmentally friendly than conventional systems. This is made possible by shared infrastructure: because the same equipment serves many clients, economies of scale reduce the carbon footprint of the service provider and, by extension, its clients.

8. They Reduce IT Support Dependency

Cloud document management systems free up their clients’ IT support teams. Since third-party service providers maintain their own equipment, the in-house IT teams don’t need to get involved in things like software updates, hardware maintenance, network management, licensing requirements, user monitoring, and even backup creation.

This can either allow IT teams to be downsized to match the reduced support requirement or free them to directly improve the efficiency of the rest of the office’s technology.


You can read more about Benefits of Cloud Document Management Systems here.


4 Ways to Build a Knowledge Base in SharePoint



Custom List

The SharePoint Custom List has been around forever, but it has changed a lot over the years – lists are now modern and easy to use. In addition, you can now set unique permissions on individual rows within a list, giving various contributors the ability to edit their own entries. Moreover, you can also format the list, giving your knowledge base a modern look. At a minimum, you can easily create columns (metadata), categorizing the entries any way you want.
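Lists like this can also be created programmatically through the Microsoft Graph API. A sketch of the request body only: sending it requires a site ID and an OAuth token, which are omitted here, and the "Category" and "Reviewed" columns are just example metadata.

```python
import json

# Request body for Microsoft Graph's "create list" endpoint:
# POST https://graph.microsoft.com/v1.0/sites/{site-id}/lists
create_list_body = {
    "displayName": "Knowledge Base",
    "columns": [
        {"name": "Category", "text": {}},       # free-text metadata column
        {"name": "Reviewed", "boolean": {}},    # yes/no metadata column
    ],
    "list": {"template": "genericList"},        # a plain custom list
}

print(json.dumps(create_list_body, indent=2))
```

Defining the metadata columns up front means every knowledge-base entry can be categorized and filtered from day one, rather than retrofitting columns later.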

Pages with metadata

The idea behind the second option is that instead of info being stored in a row within a list, each entry gets its own SharePoint page. This, of course, gives you lots of flexibility in terms of content (text, images, videos, etc.), and you get far more real estate to store the information. You can utilize some additional features available within the Site Pages library (that is where all the SharePoint pages are stored) to spice up the knowledge base built with SharePoint pages:

  • Ability to create page templates – you can standardize the look and feel of every wiki article
  • Ability to create custom metadata on the Site Pages library and display it on the article itself

All the pages built on a site are searchable by the mighty SharePoint search, so you can use keyword search and metadata filtering if you opt for metadata.

Page with Collapsible Sections

The third option, which SharePoint got recently, is a cross between the two previous options. If you like the flexibility of a list with its ability to group information by question/answer, yet also like the capability of a page to store text, images, and other web parts, you might want to check out collapsible sections.

Viva Topics

Finally, SharePoint has an option to create a knowledge base based on AI as well as manual input. This is possible thanks to the newly released Viva Topics, a module within the Viva platform. It is a contextual option – topics might surface during a Teams conversation, in SharePoint search, in news posts, and so on.


You can read more about SharePoint here.


Visual Studio vs. Visual Studio Code



Choosing between Visual Studio Code and Visual Studio is not so simple. While Visual Studio Code is highly configurable, Visual Studio is highly complete. The choice may depend as much on your work style as on the language support and features you need.

Let’s take a look at the capabilities of these two development tools.

Visual Studio Code

Visual Studio Code is a lightweight but powerful source code editor that runs on your desktop and is available for Windows, macOS, and Linux. It comes with built-in support for JavaScript, TypeScript, and Node.js and has a rich ecosystem of extensions for other languages (such as C++, C#, Java, Python, PHP, and Go) and runtimes (such as .NET and Unity).

Aside from the whole idea of being lightweight and starting quickly, VS Code has IntelliSense code completion for variables, methods, and imported modules; graphical debugging; linting, multi-cursor editing, parameter hints, and other powerful editing features; snazzy code navigation and refactoring; and built-in source code control including Git support. Much of this was adapted from Visual Studio technology.

VS Code proper is built using the Electron shell, Node.js, TypeScript, and the Language Server protocol, and is updated on a monthly basis. The extensions are updated as often as needed. The richness of support varies across the different programming languages and their extensions, ranging from simple syntax highlighting and bracket matching to debugging and refactoring.
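The Language Server Protocol mentioned above is what lets one editor talk to many language back ends: each JSON-RPC message is framed with a Content-Length header, a blank line, and the JSON body. A minimal sketch of that framing:

```python
import json

def frame_lsp_message(payload: dict) -> bytes:
    """Frame a JSON-RPC payload the way the Language Server Protocol expects."""
    body = json.dumps(payload).encode("utf-8")
    header = f"Content-Length: {len(body)}\r\n\r\n".encode("ascii")
    return header + body

# The first message an editor sends to a language server is "initialize".
msg = frame_lsp_message({
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {"processId": None, "rootUri": None, "capabilities": {}},
})
print(msg.decode("utf-8").splitlines()[0])   # the Content-Length header line
```

Because the wire format is this simple and editor-agnostic, one language server can serve VS Code, Visual Studio, and other editors alike, which is a large part of why VS Code's language support scales across so many extensions.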

The code in the VS Code repository is open source under the MIT License. The VS Code product itself ships under a standard Microsoft product license, as it has a small percentage of Microsoft-specific customizations. It’s free despite the commercial license.

Visual Studio

Visual Studio (current version Visual Studio 2022, which is 64-bit) is Microsoft’s premier IDE for Windows and macOS. With Visual Studio, you can develop, analyze, debug, test, collaborate on, and deploy your software.

On Windows, Visual Studio 2022 has 17 workloads, which are consistent tool and component installation bundles for different development targets. Workloads are an important improvement to the Visual Studio installation process, because a full download and installation of Visual Studio 2022 can easily take hours and fill a disk, especially an SSD.

Visual Studio 2022 comes in three SKUs: Community (free, not supported for enterprise use), Professional ($1,199 first year/$799 renewal), and Enterprise ($5,999 first year/$2,569 renewal). The Enterprise Edition has features for architects, advanced debugging, and testing that the other two SKUs lack.

Visual Studio or Visual Studio Code

If your development style is test-driven, Visual Studio will work right out of the box. On the other hand, there are more than 15 test-driven development (TDD) extensions for VS Code supporting Node.js, Go, .NET, and PHP. Similarly, Visual Studio does a good job working with databases, especially Microsoft SQL Server and its relatives, but VS Code has lots of database extensions. Visual Studio has great refactoring support, but Visual Studio Code implements the basic refactoring operations for half a dozen languages.

There are a few clear-cut cases that favor one IDE over the other. For instance, if you are a software architect and you have access to Visual Studio Enterprise, you’ll want to use that for the architecture diagrams. If you need to collaborate with team members on development or debugging, then Visual Studio is the better choice. If you need to do serious code analysis or performance profiling, or debug from a snapshot, then Visual Studio Enterprise will help you.

VS Code tends to be popular in the data science community. Nevertheless, Visual Studio has a data science workload that offers many features.

Visual Studio doesn’t run on Linux; VS Code does. On the other hand, Visual Studio for Windows has a Linux/C++ workload and Azure support.

For daily bread-and-butter develop/test/debug cycles in the programming languages supported in both Visual Studio and VS Code, which tool you choose really does boil down to personal preference.


You can read more about Visual Studio and Visual Studio Code here.


What is a Data Lake



James Dixon, who coined the term, described the data lake this way:

If you think of a data mart as a store of bottled water—cleansed and packaged and structured for easy consumption—the data lake is a large body of water in a more natural state. The contents of the data lake stream in from a source to fill the lake, and various users of the lake can come to examine, dive in, or take samples.

A data lake is essentially a single data repository that holds all your data until it is ready for analysis, or possibly only the data that doesn’t fit into your data warehouse. Typically, a data lake stores data in its native file format, but the data may be transformed to another format to make analysis more efficient. The goal of having a data lake is to extract business or other analytic value from the data.

Data lakes can host binary data, such as images and video, unstructured data, such as PDF documents, and semi-structured data, such as CSV and JSON files, as well as structured data, typically from relational databases. Structured data is more useful for analysis, but semi-structured data can easily be imported into a structured form. Unstructured data can often be converted to structured data using intelligent automation.

Data lake vs data warehouse

The major differences between data lakes and data warehouses:

  • Data sources: Typical sources of data for data lakes include log files, data from click-streams, social media posts, and data from internet connected devices. Data warehouses typically store data extracted from transactional databases, line-of-business applications, and operational databases for analysis.
  • Schema strategy: The database schema for a data lake is usually applied at analysis time, which is called schema-on-read. The database schema for enterprise data warehouses is usually designed prior to the creation of the data store and applied to the data as it is imported. This is called schema-on-write.
  • Storage infrastructure: Data warehouses often have significant amounts of expensive RAM and SSD disks in order to provide query results quickly. Data lakes often use cheap spinning disks on clusters of commodity computers. Both data warehouses and data lakes use massively parallel processing (MPP) to speed up SQL queries.
  • Raw vs curated data: The data in a data warehouse is supposed to be curated to the point where the data warehouse can be treated as the “single source of truth” for an organization. Data in a data lake may or may not be curated: data lakes typically start with raw data, which can later be filtered and transformed for analysis.
  • Who uses it: Data warehouse users are usually business analysts. Data lake users are more often data scientists or data engineers, at least initially. Business analysts get access to the data once it has been curated.
  • Type of analytics: Typical analysis for data warehouses includes business intelligence, batch reporting, and visualizations. For data lakes, typical analysis includes machine learning, predictive analytics, data discovery, and data profiling.
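The schema-on-read strategy from the list above can be shown in a few lines: raw records land in the lake as-is, and a schema is imposed only when an analyst reads them. The click-stream fields below are invented for the example.

```python
import json

# Raw click-stream events as they might sit in a data lake: JSON lines,
# untyped, exactly as ingested (note the latency arrives as a string).
raw_events = [
    '{"ts": "2024-05-01T10:00:00", "user": "u1", "page": "/home", "ms": "120"}',
    '{"ts": "2024-05-01T10:00:04", "user": "u2", "page": "/pricing", "ms": "340"}',
]

def read_with_schema(lines):
    """Schema-on-read: parse and coerce types at analysis time, not at ingest."""
    for line in lines:
        rec = json.loads(line)
        yield {"ts": rec["ts"], "user": rec["user"],
               "page": rec["page"], "latency_ms": int(rec["ms"])}

rows = list(read_with_schema(raw_events))
print(rows[0]["latency_ms"] + rows[1]["latency_ms"])   # usable as numbers now
```

A warehouse would have enforced those types on the way in (schema-on-write); the lake defers that decision, which is cheaper at ingest time and more flexible when the questions change later.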

You can read more about Data Lake here.


The 6 R’s of Cloud Migration Strategy



ERP is a mission-critical application that connects all operations, from sales and customer management to inventory and finance. It provides decision-makers with the visibility they need and enhances collaboration across teams. ERP systems must perform faster, handle more capacity, and support new technologies such as machine learning, artificial intelligence, and digital assistants. A cloud-based ERP helps organizations meet these demands, which makes it imperative for them to modernize their ERP by migrating it to the cloud.

There are six effective approaches, commonly known as “The 6 R’s of Cloud Migration”.

1. REHOST (i.e. Lift and Shift)

The essence of “Lift and Shift” is to quickly gain the CAPEX, OPEX, and other benefits of cloud IaaS. It amounts to getting out of the data center, which leads to significant savings on valuable floor space and on the money spent cooling and maintaining data centers.

2. REPLATFORM (i.e. Lift, Tinker, Shift)

Replatforming is the middle ground between rehosting and refactoring: the code is not altered excessively, but it does involve slight changes to take advantage of the new cloud infrastructure. This is a good strategy for organizations that want to build trust in the cloud while achieving benefits such as increased system performance.

3. REFACTOR

Refactoring involves rebuilding or redeploying the application using cloud-native features. Unlike “Lift and Shift”, a refactored application not only pulls data from cloud storage for analysis but also completes its analytics and computations within the cloud. Companies that choose to refactor reuse existing code and frameworks but run their applications on a PaaS (Platform-as-a-Service) rather than on the IaaS used in rehosting.

4. REPURCHASE

Repurchasing means moving to a different product. Simply put, organizations can opt to discard their legacy applications altogether and switch to already-built SaaS applications from third-party vendors. This is a cost-effective strategy, but commercial products offer less customization.

5. RETIRE

Retire means the application is explicitly phased out. If your ERP fails the cloud feasibility assessment, you must decide to simply retire it and probably implement a SaaS-based ERP.

6. RETAIN

This means “do nothing for now, and revisit later”. If you are unable to take data off premises for compliance reasons, then revisit cloud migration once you have overcome those challenges or the required compliance approvals are in place.


You can read more about The 6 R’s of Cloud Migration Strategy here.
