In raster graphics, pixels are used to draw an image. A raster image is also known as a bitmap, in which the picture is divided into a grid of small pixels; essentially, a bitmap is a large number of pixels taken together.
In vector graphics, mathematical formulae are used to draw different types of shapes, lines, objects, and so on.
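To make the contrast concrete, here is a minimal Python sketch (the 4x4 bitmap and the circle_points helper are invented for illustration): the raster image stores every pixel explicitly, while the vector shape stores only a formula from which points can be generated at any resolution.

```python
import math

# Raster: a tiny 4x4 bitmap, each entry is one pixel's grayscale value (0-255).
bitmap = [
    [0,   0,   255, 255],
    [0,   0,   255, 255],
    [255, 255, 0,   0  ],
    [255, 255, 0,   0  ],
]

# Vector: a circle is stored as a formula (centre + radius), not as pixels.
# Points are generated on demand, so scaling loses no detail.
def circle_points(cx, cy, r, steps=8):
    return [(cx + r * math.cos(2 * math.pi * i / steps),
             cy + r * math.sin(2 * math.pi * i / steps))
            for i in range(steps)]

print(len(bitmap) * len(bitmap[0]), "pixels stored for the raster image")
print(circle_points(0, 0, 1, steps=4))  # four points on the unit circle
```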
Cyber security means securing our computers, electronic devices, networks, programs, and systems from cyber attacks. Cyber attacks are attacks that can occur when our systems are connected to the Internet.
Information security means protecting our systems' information from theft, illegal use, piracy, and unauthorized access. Information security has three main objectives: confidentiality, integrity, and availability of information.
Application security means securing our applications and data so that they don't get hacked, and ensuring that the applications' databases remain safe and private to their owners, so that users' data remains confidential.
Network security means securing a network and protecting the information of the users connected through that network. Over the network, hackers steal packets of data through sniffing and spoofing attacks, man-in-the-middle attacks, war driving, and so on, and misuse the data for their own benefit.
These are the oldest forms of AI systems that have extremely limited capability. They emulate the human mind's ability to respond to different kinds of stimuli. These machines do not have memory-based functionality. This means such machines cannot use previously gained experiences to inform their present actions, i.e., these machines do not have the ability to "learn." These machines can only be used for automatically responding to a limited set or combination of inputs.
Almost all present-day AI applications, from chatbots and virtual assistants to self-driving vehicles, are driven by limited memory AI.
Theory of mind AI is the next level of AI systems that researchers are currently engaged in innovating. A theory of mind level AI will be able to better understand the entities it is interacting with by discerning their needs, emotions, beliefs, and thought processes.
This is the final stage of AI development, which currently exists only hypothetically. Self-aware AI is, as the name suggests, an AI that has evolved to be so akin to the human brain that it has developed self-awareness. Creating this type of AI, which is decades, if not centuries, away from materializing, is and will always be the ultimate objective of all AI research. This type of AI will not only be able to understand and evoke emotions in those it interacts with, but also have emotions, needs, beliefs, and potentially desires of its own. And this is the type of AI that doomsayers of the technology are wary of. Although the development of self-aware AI could potentially boost our progress as a civilization by leaps and bounds, it could also lead to catastrophe: once self-aware, an AI could form ideas like self-preservation, which may directly or indirectly spell the end for humanity, as such an entity could easily outmaneuver the intellect of any human being and plot elaborate schemes to take over humanity.
This type of artificial intelligence represents all the existing AI, including even the most complicated and capable AI that has ever been created to date. Artificial narrow intelligence refers to AI systems that can only perform a specific task autonomously using human-like capabilities. These machines can do nothing more than what they are programmed to do, and thus have a very limited or narrow range of competencies.
Artificial General Intelligence is the ability of an AI agent to learn, perceive, understand, and function completely like a human being. These systems will be able to independently build multiple competencies and form connections and generalizations across domains, massively cutting down on time needed for training. This will make AI systems just as capable as humans by replicating our multi-functional capabilities.
The development of Artificial Superintelligence (ASI) will probably mark the pinnacle of AI research, as ASI will become by far the most capable form of intelligence on earth. In addition to replicating the multi-faceted intelligence of human beings, ASI will be exceedingly better at everything it does because of overwhelmingly greater memory, faster data processing and analysis, and superior decision-making capabilities. The development of AGI and ASI will lead to a scenario most popularly referred to as the singularity. And while the potential of having such powerful machines at our disposal seems appealing, these machines may also threaten our existence or, at the very least, our way of life.
Quality assurance (QA) engineering is an essential process within the software and product development world. This field involves ensuring that products or services meet a particular set of standards while providing quality feedback to help keep development teams on track.
Video game design involves the careful collaboration of a team of people, including a lead designer, artists, coders, and testers. This field combines elements from various disciplines like drawing, concepting, modeling, story writing, level design, programming, voice acting, and audio engineering to create immersive virtual experiences that millions of people can interact with and enjoy.
Software integration engineering (SIE) is a rapidly growing field that combines software components into unified, working solutions.
Front-end engineering is a type of software development focused on making websites and applications better for the user through improved accessibility, visual design, and increased responsiveness.
Full-stack development in software engineering is the process of developing web applications based on a combination of back-end and front-end technologies.
Linear data structures are the simplest, arranging data in a single level so that elements are stored and traversed sequentially; arrays, linked lists, stacks, and queues are common examples.
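As a rough illustration, the following Python sketch implements a singly linked list, a classic linear structure in which each element points to exactly one successor (the Node and LinkedList classes are written just for this example).

```python
class Node:
    """One element of a singly linked list: a value plus a link to the next node."""
    def __init__(self, value):
        self.value = value
        self.next = None

class LinkedList:
    """A minimal linear structure: each node has exactly one successor."""
    def __init__(self):
        self.head = None

    def append(self, value):
        node = Node(value)
        if self.head is None:
            self.head = node
            return
        current = self.head
        while current.next is not None:
            current = current.next
        current.next = node

    def to_list(self):
        values, current = [], self.head
        while current is not None:
            values.append(current.value)
            current = current.next
        return values

items = LinkedList()
for v in (10, 20, 30):
    items.append(v)
print(items.to_list())  # [10, 20, 30]
```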
Graph data structures and algorithms are key computer science concepts. Graphs are nonlinear data structures. They contain multiple levels of data and do not connect elements sequentially. As a result, graphs enable computing professionals to solve complex problems using data.
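For instance, a graph can be stored as an adjacency list, where each vertex maps to the set of vertices it connects to; the small undirected graph below is invented for illustration.

```python
# A small undirected graph stored as an adjacency list:
# each key maps to the set of vertices it is connected to.
graph = {
    "A": {"B", "C"},
    "B": {"A", "D"},
    "C": {"A", "D"},
    "D": {"B", "C", "E"},
    "E": {"D"},
}

def neighbors(g, vertex):
    """Return the vertices directly connected to `vertex`."""
    return g.get(vertex, set())

print(sorted(neighbors(graph, "D")))  # ['B', 'C', 'E']
```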
Sorting algorithms are step-by-step procedures for rearranging data in arrays and lists; insertion sort is one example.
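A minimal Python version of insertion sort, which grows a sorted prefix one element at a time, might look like this (the sample list is arbitrary).

```python
def insertion_sort(values):
    """Sort a list in place by growing a sorted prefix one element at a time."""
    for i in range(1, len(values)):
        current = values[i]
        j = i - 1
        # Shift larger elements of the sorted prefix one slot to the right.
        while j >= 0 and values[j] > current:
            values[j + 1] = values[j]
            j -= 1
        values[j + 1] = current
    return values

print(insertion_sort([5, 2, 4, 6, 1, 3]))  # [1, 2, 3, 4, 5, 6]
```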
Searching algorithms find and retrieve specific elements from the data; linear search is one example.
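A simple Python sketch of linear search, which scans the sequence from left to right until it finds the target, could look like this.

```python
def linear_search(values, target):
    """Scan the sequence left to right; return the index of `target`, or -1 if absent."""
    for index, value in enumerate(values):
        if value == target:
            return index
    return -1

print(linear_search([7, 3, 9, 4], 9))  # 2
print(linear_search([7, 3, 9, 4], 5))  # -1
```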
Computer scientists use graph traversal algorithms to search nodes in graphs and trees. Unlike linear data structures, graphs have no single sequential order, so a traversal must systematically follow edges from node to node, keeping track of which nodes it has already visited, to find and retrieve data.
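As an illustration, here is a minimal breadth-first search in Python; the visited set records nodes that have already been reached so each one is processed only once (the example graph is invented).

```python
from collections import deque

def breadth_first_search(graph, start):
    """Visit every vertex reachable from `start`, one level of neighbours at a time."""
    visited = {start}
    order = []
    queue = deque([start])
    while queue:
        vertex = queue.popleft()
        order.append(vertex)
        for neighbour in graph[vertex]:
            if neighbour not in visited:
                visited.add(neighbour)
                queue.append(neighbour)
    return order

graph = {"A": ["B", "C"], "B": ["D"], "C": ["D"], "D": ["E"], "E": []}
print(breadth_first_search(graph, "A"))  # e.g. ['A', 'B', 'C', 'D', 'E']
```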
Automata theory is the study of abstract machines called automata, which computer scientists use to describe and analyze the behavior of computer systems.
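To give a flavour of what an abstract machine looks like in practice, here is a toy deterministic finite automaton, sketched in Python, that accepts binary strings containing an even number of 1s; the state names and transition table are invented for this example.

```python
# A toy deterministic finite automaton (DFA) that accepts binary strings
# containing an even number of '1' characters.
TRANSITIONS = {
    ("even", "0"): "even",
    ("even", "1"): "odd",
    ("odd",  "0"): "odd",
    ("odd",  "1"): "even",
}
START_STATE = "even"
ACCEPT_STATES = {"even"}

def accepts(string):
    """Run the automaton over the input and report whether it ends in an accepting state."""
    state = START_STATE
    for symbol in string:
        state = TRANSITIONS[(state, symbol)]
    return state in ACCEPT_STATES

print(accepts("1010"))  # True  (two 1s)
print(accepts("111"))   # False (three 1s)
```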
Computability theory, or recursion theory, is the study of what decision problems a computer program can and cannot solve. A decision problem is a yes-or-no question that can have an infinite number of possible inputs.
Computational complexity focuses on how much time and memory different algorithms require. The more resources the algorithm requires, the more complex it is.
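A rough way to see this is to count the basic operations two simple routines perform as the input size grows; the sketch below (with invented helper names) contrasts a linear, O(n) count with a quadratic, O(n^2) one.

```python
def count_linear(n):
    """Roughly O(n): one unit of work per element."""
    steps = 0
    for _ in range(n):
        steps += 1
    return steps

def count_quadratic(n):
    """Roughly O(n^2): one unit of work per pair of elements."""
    steps = 0
    for _ in range(n):
        for _ in range(n):
            steps += 1
    return steps

for n in (10, 100, 1000):
    print(n, count_linear(n), count_quadratic(n))
# When n grows by 10x, the linear count grows by 10x but the quadratic count by 100x.
```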
LANs connect computers and peripheral devices in a limited physical area, such as a business office, laboratory, or college campus, by means of links (wires, Ethernet cables, fibre optics, Wi-Fi) that transmit data rapidly.
WANs connect computers and smaller networks to larger networks over greater geographic areas, including different continents. They may link the computers by means of cables, optical fibres, or satellites, but their users commonly access the networks via a modem (a device that allows computers to communicate over telephone lines).
An instruction set architecture, or ISA, is a collection of instructions that a computer processor reads. It outlines how the central processing unit (CPU) is controlled by its software, and effectively acts as the interface between the machine’s hardware components and its software.
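As a purely illustrative sketch, the tiny interpreter below defines an invented three-instruction "ISA" in Python; real instruction sets such as x86, ARM, or RISC-V are vastly richer, but the idea that software is expressed as instructions the processor knows how to execute is the same.

```python
# A toy, invented instruction set with three opcodes (LOAD, ADD, PRINT).
program = [
    ("LOAD", "r0", 2),           # put the constant 2 into register r0
    ("LOAD", "r1", 3),           # put the constant 3 into register r1
    ("ADD",  "r2", "r0", "r1"),  # r2 = r0 + r1
    ("PRINT", "r2"),             # output the contents of r2
]

def run(program):
    registers = {}
    for instruction in program:
        opcode, operands = instruction[0], instruction[1:]
        if opcode == "LOAD":
            registers[operands[0]] = operands[1]
        elif opcode == "ADD":
            registers[operands[0]] = registers[operands[1]] + registers[operands[2]]
        elif opcode == "PRINT":
            print(registers[operands[0]])

run(program)  # prints 5
```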
Also called computer organisation, microarchitecture is an important sub-category of computer architecture. There is an inherent interconnection between the microarchitecture and the instruction set architecture, because the microarchitecture outlines how a processor implements its ISA.
System design incorporates all of a computer's physical hardware elements, such as its data processors, multiprocessors, and graphics processing units (GPUs).
The von Neumann model of computer architecture was developed by John von Neumann in the 1940s. It outlines a model of computer architecture with five elements: memory, a control unit, an arithmetic logic unit, and input and output mechanisms.
Harvard architecture uses separate memory storage for instructions and for data. This differs from, for example, Von Neumann architecture, with programme instructions and data sharing the same memory and pathways.
Single instruction, multiple data processing computers can process multiple data points simultaneously. This paved the way for supercomputers and other high-performance machines, at least until developers at organisations like Intel and IBM started moving into multiple instruction, multiple data (MIMD) models.
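As a loose software-level analogy, NumPy's vectorized operations apply a single operation to many data points at once; whether the underlying implementation actually uses hardware SIMD instructions is an implementation detail and not guaranteed here.

```python
import numpy as np

# One "instruction" (element-wise addition) applied to many data points at once,
# instead of adding the pairs one by one in an explicit loop.
a = np.array([1, 2, 3, 4, 5, 6, 7, 8])
b = np.array([10, 20, 30, 40, 50, 60, 70, 80])
print(a + b)  # [11 22 33 44 55 66 77 88]
```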
Multicore architecture uses a single physical processor to incorporate the core logic of more than one processor, with the aim of creating systems that complete multiple tasks at the same time in the name of optimisation and better overall system performance.
As the name signifies, predictive data mining analyses data in order to anticipate what may happen later in the business.
The main goal of descriptive data mining tasks is to summarize the given data or turn it into relevant information.
A procedural programming language is used to execute a sequence of statements that lead to a result. Typically, this type of language uses multiple variables, heavy loops, and other elements, which separates it from functional programming languages.
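A small procedural sketch in Python might look like the following, where an explicit loop and mutable variables build up the result step by step (the average function is invented for illustration).

```python
# Procedural style: a sequence of statements and an explicit loop that mutate
# variables step by step to produce a result.
def average(numbers):
    total = 0
    count = 0
    for n in numbers:
        total += n
        count += 1
    return total / count

print(average([4, 8, 15, 16, 23, 42]))  # 18.0
```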
A functional programming language typically works by applying functions to data, frequently avoiding loops in favor of recursive functions.
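In the same spirit, a functional-style sketch avoids loops and mutation, building results from recursion and function application (the total helper below is invented for illustration).

```python
# Functional style: no loops and no mutation; the result is built from
# function application and recursion.
def total(numbers):
    if not numbers:
        return 0
    return numbers[0] + total(numbers[1:])

squares = list(map(lambda n: n * n, [1, 2, 3, 4]))

print(total([4, 8, 15, 16, 23, 42]))  # 108
print(squares)                        # [1, 4, 9, 16]
```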
This type of programming language views the world as a group of objects that have internal data along with external means of accessing parts of that data.
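A minimal object-oriented sketch in Python: the class bundles internal data with the methods that provide external access to it (the BankAccount class is invented for illustration).

```python
class BankAccount:
    """An object that bundles internal data (the balance) with the methods
    that provide controlled, external access to it."""

    def __init__(self, owner, balance=0):
        self.owner = owner
        self._balance = balance  # internal state, accessed through methods

    def deposit(self, amount):
        self._balance += amount

    def balance(self):
        return self._balance

account = BankAccount("Ada")
account.deposit(100)
print(account.owner, account.balance())  # Ada 100
```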
These programming languages are often procedural and may comprise object-oriented language elements, but they fall into their own category as they are normally not full-fledged programming languages with support for development of large systems. For example, they may not have compile-time type checking. Usually, these languages require very little syntax to get started.
This type of language doesn't tell the computer how to do something; instead, it places restrictions on what the computer must consider doing.
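SQL is a commonly cited declarative language, so one way to illustrate the idea from Python is with the standard library's sqlite3 module: the query below states what rows are wanted, not how to find them (the in-memory table and its contents are invented for this sketch).

```python
import sqlite3

# The SQL query is declarative: it states *what* rows are wanted
# (names of people older than 30), not *how* to scan or filter the table.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE people (name TEXT, age INTEGER)")
conn.executemany("INSERT INTO people VALUES (?, ?)",
                 [("Ada", 36), ("Grace", 29), ("Alan", 41)])

for (name,) in conn.execute("SELECT name FROM people WHERE age > 30"):
    print(name)  # Ada, Alan
```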
The history of computer systems goes back to Charles Babbage's difference engines. Despite never being completed, these machines are considered the first example of a computing system. They came before the early 20th-century mainframes and large computers. The Von Neumann machine and others of its kind were later used as the first massive, monolithic computers.
The microprocessor revolution of the 1970s and 1980s saw the introduction of personal computers, also known as desktop computers. The first true home computer that came with a monitor display was released to the public in 1977.
The operating system was initially created to support a complete computer system in a box and to provide users with a common interface for using the software that operated on that hardware.
Laptops emerged as hardware became smaller and more portable over time. The Portal, regarded as the first portable microcomputer, debuted in 1980. It was built around an 8-bit, 2 MHz Intel 8085 processor and was equipped with 64 KB of RAM.
The introduction of the modern cloud in the early 2000s revolutionized software distribution and data storage. The boxed software strategy was rendered obsolete in the enterprise IT sector as software came to be delivered digitally over the internet in place of physical media such as floppy disks and compact discs.
The concept of hardware and software configurations has recently undergone a radical change thanks to virtualization. Instead of using physical hardware, the majority of current computing systems use virtualized computer systems.
This is the type of database that stores data in a single, centralized database system. It allows users to access the stored data from different locations through several applications.
Cloud security focuses on protecting cloud-based assets and services, including applications, data, and infrastructure. Most cloud security is managed as a shared responsibility between organizations and cloud service providers.
A subset of information security, data security combines many types of cybersecurity solutions to protect the confidentiality, integrity, and availability of digital assets at rest (i.e., while being stored) and in motion (i.e., while being transmitted).
IoT security seeks to minimize the vulnerabilities that these proliferating devices bring to organizations. It uses different types of cybersecurity to detect and classify them, segment them to limit network exposure, and seek to mitigate threats related to unpatched firmware and other related flaws.
Unlike a centralized database system, in a distributed database the data is distributed among different database systems of an organization. These database systems are connected via communication links, which help end users access the data easily. Examples of distributed databases include Apache Cassandra, HBase, and Ignite.
The zero trust security model replaces the traditional perimeter-focused approach of building walls around an organization’s critical assets and systems. There are several defining characteristics of the zero trust approach, which leverages many types of cybersecurity.