News

Updated for 2021 LTS: The definitive guide to lighting in the High Definition Render Pipeline

The definitive guide to lighting in the High Definition Render Pipeline (HDRP) is now updated with tips for taking on the latest capabilities in Unity 2021 LTS. Learn how to create high-end lighting with production-ready HDRP features – from Light Anchors to Lens Flares, and beyond. This e-book was initially created and published late last year to demonstrate the power of physically based lighting in HDRP for generating high-end lighting effects across PC and console games. We’ve received positive feedback from our users, and have since updated the guide to include key features in Unity 2021 LTS. This way, it can remain a foundational, advanced-level resource for technical artists, lighting artists, and developers working in Unity.

Read More

Creating games for everyone: Introducing Unity Learn’s new accessibility course

Practical Game Accessibility is a new, free online course for intermediate creators. It’s an introduction to creating games that more players can enjoy. As you work through the course, you’ll learn about prioritizing accessibility while building a game guided by an inclusive design approach. To support this learning journey, we created Out of Circulation – a small, vertical slice of a point-and-click narrative adventure game. You’ll use Out of Circulation as an example case study to explore and expand upon throughout the course. “You’ll work it out, Sureswim,” Old Smalt reassures you as she passes you the apanthometer and sends you on your way. Surely the benevolent tech-witch and her gadgets will help you solve the mystery surrounding the local library. While your sidekick Wink is an expert in eavesdropping, you’re going to need all the support you can get!

Not working on a game? No problem! Although Practical Game Accessibility uses games and game development as its core example, you can also apply much of what you’ll learn to other non-game projects, such as simulations, visualizations, and other real-time applications.

Read More

A Step-By-Step Guide To Cross-Platform App Development

In a world where people are constantly glued to their gadgets, it’s no surprise that businesses are investing more in mobile app development. But with so many different types of devices out there, how can you make sure your app will work on all of them? The answer is cross-platform app development!

What is cross-platform app development?

Cross-platform app development is the process of creating software that runs on multiple platforms. This can be done either by creating separate versions of the software for each platform or by using a cross-platform development tool that allows the software to be written once and then compiled for each platform. There are many benefits to developing cross-platform apps, including reaching a wider audience and saving time and resources by having only one codebase to create and maintain. However, some challenges need to be considered, such as ensuring that the software works correctly on all platforms and dealing with differences in how each platform handles certain features. If you’re thinking about developing a cross-platform app, check out our step-by-step guide below. We’ll take you through everything you need to know, from choosing the right development tool to testing your app on all devices.

RAD Studio – Native and Cross-Platform Development Ecosystem

Why develop a cross-platform app?

Cross-platform app development has become increasingly popular as more and more businesses look to reach their customers across various devices and platforms. There are several reasons why you might choose to develop a cross-platform app:

Reach a wider audience: Developing your app for multiple platforms means that you can reach a greater number of potential customers.

Create Cross-Platform Native Applications with Delphi FireMonkey – Also deploy to Linux systems using FMXLinux

Cost-effective: Developing a cross-platform app can be more cost-effective than developing separate apps for each platform, as you only need to create one codebase.

Build Cross-Platform Native Apps with Delphi FireMonkey

Time-saving: Developing a cross-platform app can save you time, as you only need to write one set of code, which can then be used across all platforms. Use Delphi’s award-winning VCL framework for Windows and the FireMonkey (FMX) visual framework for cross-platform responsive UIs, and use FireMonkey’s design-time guidelines to prototype faster with visual guide lines and enhanced margin and padding support. Moreover, with FireUI’s revolutionary technology you can see what your application’s UI looks like on any device without installing it.

Increased flexibility: Cross-platform apps offer increased flexibility, as they can easily be adapted to work on new platforms or devices as they are released.

Delphi with the FireMonkey framework is the best combination of a programming language and a framework for building cross-platform apps in no time. FireMonkey’s underlying visual control architecture enables multiple presentation implementations per control, called ControlTypes; in particular, native OS control presentations can be utilised. Furthermore, the underlying architecture is MVC-based and lets you choose at design time between Styled and Platform (OS) specific control types with their native features. This gives you smooth scrolling and performance while maintaining complete cross-platform fidelity across Windows, macOS, Android, iOS, and Linux. If you’re considering developing a cross-platform app, this guide will give you all the information you need to get started.

What are the different types of cross-platform apps?

There are two main types of cross-platform apps: native and […]

Read More

TMS FNC Chart: Visualize your Grid Data

With the release of the new TMS FNC Chart version 2.0, it is now possible to show beautiful graphs based on the cell data in a TMS FNC Grid. This requires the installation of the TMS FNC UI Pack next to TMS FNC Chart.

TTMSFNCChartGridAdapter

The files for the TTMSFNCChartGridAdapter are available in the directory, but due to the dependencies on the TMS FNC UI Pack, they are not added to the packages by default. If you want to use the component at design time, it will be necessary to add the four different TTMSFNCChartGridAdapter units (and the Reg, DE and RegDE units) with the prefix to the respective framework packages.

Setup

You can quickly set up the environment necessary to start. You need a TTMSFNCChart, a TTMSFNCGrid and the TTMSFNCChartGridAdapter. Link the Adapter property of the TTMSFNCChart to the TTMSFNCChartGridAdapter, and set the Source of the TTMSFNCChartGridAdapter to the TTMSFNCGrid. You can now start making your graph. By default, the TTMSFNCChartGridAdapter will try to create a new series for each column that has a value in the first row. If it finds a text value in the first row, it will use the first column it encounters as the x-value text.

TTMSFNCChartGridAdapterSeries

If you are using a more uncommon positioning of your data, or you want to use multi-point data, you can use the TTMSFNCChartGridAdapterSeries. This collection contains items for the different series that should be drawn in the chart. For this, you’ll need to set the AutoCreateSeries property of the TTMSFNCChartGridAdapter to False and add a new series item. There you can set the columns to use from your TTMSFNCGrid. Now we set some data in the cells, and set the format type of the X-axis to vftDateTime so the adapter tries to parse the cells as TDateTime values. Because the adapter only triggers when a cell has been edited, we need to call the Synchronize procedure when changing cells at runtime.
procedure TForm.FormCreate(Sender: TObject);
begin
  TMSFNCGrid.Cells[1,0] := 'Date';
  TMSFNCGrid.Cells[2,0] := 'Low';
  TMSFNCGrid.Cells[3,0] := 'High';
  TMSFNCGrid.Cells[4,0] := 'Open';
  TMSFNCGrid.Cells[5,0] := 'Close';
  TMSFNCGrid.Cells[1,1] := '7/06/2022';
  TMSFNCGrid.Cells[1,2] := '8/06/2022';
  TMSFNCGrid.Cells[1,3] := '9/06/2022';
  TMSFNCGrid.Cells[1,4] := '10/06/2022';
  TMSFNCGrid.Cells[1,5] := '11/06/2022';
  TMSFNCGrid.Cells[2,1] := '24.8';
  TMSFNCGrid.Cells[2,2] := '23.8';
  TMSFNCGrid.Cells[2,3] := '22.8';
  TMSFNCGrid.Cells[2,4] := '21.8';
  TMSFNCGrid.Cells[2,5] := '20.8';
  TMSFNCGrid.Cells[3,1] := '32.5';
  TMSFNCGrid.Cells[3,2] := '30.2';
  TMSFNCGrid.Cells[3,3] := '34.6';
  TMSFNCGrid.Cells[3,4] := '30.2';
  TMSFNCGrid.Cells[3,5] := '33.7';
  TMSFNCGrid.Cells[4,1] := '27.3';
  TMSFNCGrid.Cells[4,2] := '25.3';
  TMSFNCGrid.Cells[4,3] := '28.0';
  TMSFNCGrid.Cells[4,4] := '30.1';
  TMSFNCGrid.Cells[4,5] := '30.0';
  TMSFNCGrid.Cells[5,1] := '25.3';
  TMSFNCGrid.Cells[5,2] := '28.1';
  TMSFNCGrid.Cells[5,3] := '30.2';
  TMSFNCGrid.Cells[5,4] := '30.2';
  TMSFNCGrid.Cells[5,5] := '24.6';
  TMSFNCChart.DefaultLoadOptions.XValuesFormatType := vftDateTime;
  TMSFNCChartGridAdapter.Synchronize;
end;

procedure TForm.TMSFNCChartGridAdapterSynchronized(Sender: TObject);
begin
  TMSFNCChart.Series[0].ChartType := ctCandleStick;
  TMSFNCChart.Series[0].MinXOffsetPercentage := 10;
  TMSFNCChart.Series[0].MaxXOffsetPercentage := 10;
end;

Events to customize the data

To make it possible to manipulate the grid data some more, we’ve added a couple of events so you can set up the chart to your liking. I’ve changed the TTMSFNCChart to a TTMSFNCBarChart, but it’s also possible to set the chart type of the series in the Synchronized event.
// Example data
procedure TForm.FormCreate(Sender: TObject);
begin
  TMSFNCGrid.Cells[1,0] := 'Prod-Num';
  TMSFNCGrid.Cells[2,0] := 'Product';
  TMSFNCGrid.Cells[3,0] := 'Price';
  TMSFNCGrid.Cells[4,0] := 'Volume';
  TMSFNCGrid.Cells[5,0] := 'Color';
  TMSFNCGrid.Cells[1,1] := 'CA-J-123.45';
  TMSFNCGrid.Cells[1,2] := 'CA-S-155.78';
  TMSFNCGrid.Cells[1,3] := 'CA-S-267.36';
  TMSFNCGrid.Cells[1,4] := 'CA-D-102.10';
  TMSFNCGrid.Cells[1,5] := 'CA-S-403.48';
  TMSFNCGrid.Cells[2,1] := 'Jeans';
  TMSFNCGrid.Cells[2,2] := 'Shirts';
  TMSFNCGrid.Cells[2,3] := 'Sweaters';
  TMSFNCGrid.Cells[2,4] := 'Dresses';
  TMSFNCGrid.Cells[2,5] := 'Shoes';
  TMSFNCGrid.Cells[3,1] := '48.99';
  TMSFNCGrid.Cells[3,2] := '26.99';
  TMSFNCGrid.Cells[3,3] := '34.99';
  TMSFNCGrid.Cells[3,4] […]

Read More

Display and edit math formulae using the new math editor component

With the latest update of the TMS FNC WX Pack, we released the new TTMSFNCWXMathEditor component. This component lets you easily display and edit math formulae. No code is required to get started: simply drop it on the form and you’re ready to go.

Virtual Keyboards

Not only can you directly type in the math formula you want, you can also use the virtual keyboards that are available in the component. There are multiple keyboards available, such as one for functions and one for typing symbols. These keyboards are customizable: you can set a theme, choose which keyboards to display, and even decide when to show the virtual keyboard (i.e. on focus, manual, off, auto).

LaTeX

You can also type LaTeX commands to render your formula. Currently, over 800 LaTeX commands are supported. The editor even gives you a visual aid when you start typing some of the commands. You can use LaTeX directly in the editor, or use the Math property to set a default value to be shown. You can also export the typed-in formula as LaTeX or in several other formats, such as spoken text or ASCIIMath.

Video

Watch this video to get started with the TTMSFNCWXMathEditor.
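As a hypothetical illustration of the kind of LaTeX input such an editor accepts (this formula is not taken from the component’s documentation), the quadratic formula could be typed as:

```latex
x = \frac{-b \pm \sqrt{b^2 - 4ac}}{2a}
```

Standard constructs like \frac, \sqrt, and \pm are among the commonly supported LaTeX commands, and a string like this is also the form you would assign to a default-value property or obtain from a LaTeX export.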

Read More

Learn Python with Pj! Part 5 – Build a hashtag tracker with the Twitter API

This is the fifth and final installment in the Learn Python with Pj! series. Make sure to read:

Putting it all together

I’ve completed my Python course on Codecademy, and am excited to put the skills I learned into building something practical. I’ve worked with the Twitter API before; I wrote a few bots in Node.js to make them tweet and respond to tweets they’re tagged in. I thought it’d be fun to work with the API again, but this time do it in Python. I didn’t just want to make another bot, so I had to figure out something else. In this case, I made a bot that can track hashtags being used in real time on Twitter. Here’s my repo containing a few different files, but live_tweets.py is what we’ll focus on for this blog. Let’s talk about how I built it and what it does.

import tweepy
import config

auth = tweepy.OAuth1UserHandler(
    config.consumer_key,
    config.consumer_secret,
    config.access_token,
    config.access_token_secret
)
api = tweepy.API(auth)

# prints the text of the tweet using the hashtag designated in stream.filter(track=[])
class LogTweets(tweepy.Stream):
    def on_status(self, status):
        date = status.created_at
        username = status.user.screen_name
        try:
            tweet = status.extended_tweet["full_text"]
        except AttributeError:
            tweet = status.text
        print("**Tweet info**")
        print(f"Date: {date}")
        print(f"Username: {username}")
        print(f"Tweet: {tweet}")
        print("*********")
        print("*********\n")

if __name__ == "__main__":
    # creates an instance of LogTweets with authentication
    stream = LogTweets(config.consumer_key, config.consumer_secret,
                       config.access_token, config.access_token_secret)
    # hashtags as str in this list will be watched live on twitter.
    hashtags = []
    print("Looking for Hashtags...")
    stream.filter(track=hashtags)

Here’s how this all works. First, we import two modules: Tweepy and config. Tweepy is a wrapper that makes using the Twitter API very easy. Config allows us to use config files and keep our secrets safe.
This is important since using the Twitter API involves four keys that are specific to your Twitter developer account. Getting these keys is covered in this Twitter documentation. We’ll talk about what’s in the config file and how it works later. The next line defines the variable auth using Tweepy’s built-in authorization handler. Normally, you’d put the keys in directly here, but since we’re trying to keep secrets safe, we handle those through the config file. In order to call the variables hosted in the config file, we type config.variable_name. Finally, in order to access the Tweepy API, we create the variable api, with the auth variable from the line above passed into tweepy.API(). Now, the variable api will give us access to all the features in Tweepy’s Twitter API library.

You’re invited! Join us on June 23rd for the GitLab 15 launch event with DevOps guru Gene Kim and several GitLab leaders. They’ll show you what they see for the future of DevOps and The One DevOps Platform.

For our purposes, we want to find a hashtag being used, then collect the tweet that used it and print some information about the tweet to the console. To make this happen, we’ve created a class called LogTweets that inherits from tweepy.Stream. Stream is a Twitter API term that refers to all of the tweets being posted on Twitter at any given moment. Think of it as opening a window looking out onto every single tweet as it’s posted. We have to make this open connection in order to be able to find tweets that are using our hashtag. Inside LogTweets, we define a […]
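The post defers the contents of the config file to later. As a minimal sketch of what such a config.py could look like (the variable names follow what live_tweets.py accesses as config.consumer_key and so on, and the values are placeholders, not real credentials):

```python
# config.py -- keeps the Twitter API secrets out of the main script.
# Replace the placeholder strings with the four keys from your own
# Twitter developer account. The variable names must match what the
# main script reads, e.g. config.consumer_key.
consumer_key = "YOUR_CONSUMER_KEY"
consumer_secret = "YOUR_CONSUMER_SECRET"
access_token = "YOUR_ACCESS_TOKEN"
access_token_secret = "YOUR_ACCESS_TOKEN_SECRET"
```

Listing config.py in .gitignore (or loading the values from environment variables instead) keeps the real keys out of version control when you push the repo.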

Read More

Ski first, work later – How to win the burnout battle

It’s 9:13 am and 20 degrees outside in Big Sky, Montana. I’m bundled up in my warm rainbow pride ski suit. Dangling 30 feet in the crisp air, perched on a ski lift, I’m on my way up to a double black diamond run 9,382 feet above sea level. There are few people out this early on a Wednesday morning. I ski off the top of the lift and enjoy a beautifully untracked run of champagne powder snow, fresh from last night’s snowstorm. This is a normal start to the workday for me. And I have a bit of a secret to admit, this is exactly why I joined GitLab. Something’s gotta give Rewind two years to January 2020, before I joined GitLab. Before I had materialized my daily skiing routine. Before I moved to Big Sky. Before the global Covid-19 pandemic. I had decided I needed to make a change in my life. I had spent the past decade of my life climbing the startup tech career ladder. Along the way I had sacrificed my health, happiness, and my mental and emotional well-being. I was burnt out. While I don’t think I’d change anything going back, I knew the next decade wouldn’t sustain that lack of work and life balance. I needed to get back to being the person my friends and family knew: a slim guy with a smile always on his face and a hopeful outlook for the future. A remote change GitLab had been on my radar for a number of years as many of my tech friends had become DevOps engineers, but I had not used it myself. What I did know was at the time they were one of the few truly remote companies with no offices and a global team embracing an async work style. While I hadn’t ever worked remotely before, I knew I liked the idea of not being stuck in a bland office of noisy and distracting open floor layout workspaces surrounded by silly ping pong tables and unlimited snacks.
My previous employers thought these things made for a ‘supportive environment’ and ‘great work culture’. I couldn’t disagree more. It was a scary thought to have less structure, but my previous decade had shown me those offices weren’t conducive to my sanity, happiness, or productivity. So I decided, let’s go all in. I knew I wanted to make a big change, so I tested GitLab when I was interviewing: I gauged reactions from my interview panel as I described my desire to move to a ski mountain and balance working and skiing. I was caught by surprise. Every person I interviewed with loved the idea and assured me that GitLab’s remote and async working style would support this plan. Just about everyone had a story of how they themselves had adjusted their schedule to add flexibility to their lives. I was convinced. This was the future. A global pandemic Two months after joining GitLab in January 2020, the pandemic ruined my plans to relocate to a wintry wonderland. I delayed […]

Read More

GitLab Heroes Unmasked – How I became acquainted with the GitLab Agent for Kubernetes

A key to GitLab’s success is our vast community of advocates. Here at GitLab, we call these active contributors “GitLab Heroes.” Each hero contributes to GitLab in numerous ways, including elevating releases, sharing best practices, speaking at events, and more. Jean-Phillippe Baconnais is an active GitLab Hero who hails from France. We applaud his contributions, including leading community engagement events. Baconnais shares his interest in Kubernetes and explains how to deploy and monitor an application in Kubernetes without leaving GitLab.

I’ve been a developer since 2007, and I’ve learned a lot about continuous integration, deployment, infrastructure, and monitoring. In both my professional and personal time, my favorite activity remains software development. After creating a new application with multiple components, I wanted to deploy it on Kubernetes, which has become hugely popular over the last few years. This would let me experiment with the platform, and it promised to be a lot of fun. I know some of the terms; I used them in production for five years. But Kubernetes administration is not my “cup of tea” 😅.

My first deployment in Kubernetes

When I decided to deploy an application on Kubernetes, I wasn’t sure where to start until, while navigating my project in GitLab, I saw a menu called “Kubernetes.” I wanted to know what GitLab was hiding behind this. Does this feature link my project’s sources to a Kubernetes cluster? I used the credit offered by Google Cloud to discover and test this platform. Deploying my application on Kubernetes was easy. I wrote a blog post in 2019 describing how I did this, or rather, how GitLab helped me create this link so easily. In this blog post I will explain further and talk about what’s changed since then. Behind the “Kubernetes” menu, GitLab helps you integrate Kubernetes into your project. You can create, from GitLab, a cluster on Google Cloud Platform (GCP) or Amazon Web Services (AWS).
If you already have a cluster on these platforms or anywhere else, you can connect to it. You just need to specify the cluster name, the Kubernetes API URL, and a certificate. GitLab is a DevOps platform, and monitoring is one of the DevOps stages it covers. GitLab deploys an instance of Prometheus to get information about your cluster and facilitate the monitoring of your application. For example, you can see how many pods are deployed in your environment and their states. You can also view some charts and information about your cluster, like available memory and CPU. All these metrics are available by default, without changing the applications in your cluster. We can also read the logs directly in GitLab. For a developer, it’s great to have all this information in the same tool, and this saves us time.

A new way to integrate Kubernetes

Everything I explained in the previous chapter doesn’t quite exist anymore. The release of GitLab 14.5 was the beginning of a revolution. The certificate-based Kubernetes integration has security limitations, and many issues were filed about it. GitLab teams worked on a new way to connect to your cluster, and in version 14.5, the GitLab Agent for Kubernetes was released!

GitLab Agent for Kubernetes

The GitLab Agent for Kubernetes is a new way to connect to your cluster. This solution is easy to […]

Read More

How to automate software delivery using Quarkus and GitLab

In this day and age, organizations need to deliver innovative solutions to their customers faster than ever to stay competitive. This is why solutions that speed up software development and delivery, such as Quarkus and GitLab, are being adopted by teams across the world. Quarkus, also known as Supersonic Subatomic Java, is an open source, Kubernetes-native Java stack tailored for OpenJDK HotSpot and GraalVM, crafted from respected Java libraries and standards. Quarkus has been steadily growing in popularity and use because of the benefits it delivers: cost savings, faster time to market/value, and reliability. Quarkus offers two modes: Java and native. Its Java mode builds your application using the JDK, and its native mode compiles your Java code into a native executable. GitLab, the One DevOps Platform, includes capabilities for all DevOps stages, from planning to production, with a single model and user interface to help you ship secure code faster to any cloud and drive business results. Besides DevOps support, GitLab also offers GitOps support. The combination of Quarkus and GitLab can empower your developer and operations teams to collaborate better and spend more time innovating to deliver business value and differentiating capabilities to end users. In this article, we show how to automate the software delivery of a generated Quarkus application in Java mode using GitLab Auto DevOps. Below, we list the steps to accomplish this.

Prerequisite

The prerequisite for the subsequent instructions is to have a K8s cluster up and running and associated with a group in your GitLab account. For an example of how to do this, please watch this video.

Generate your Quarkus project using the generator and upload to GitLab

From a browser window, point to the Quarkus generator site, https://code.quarkus.io, and click on the button Generate your application.
Generate a sample Quarkus application using the generator

On the popup window, click on the button DOWNLOAD THE ZIP to download a sample Quarkus application in a ZIP file to your local machine. The downloaded file is named code-with-quarkus.zip. Unzip the file on your local machine in a directory of your choice. This will create a new directory called code-with-quarkus with all the files for the sample Quarkus application. From a browser window, open https://gitlab.com and log in using your GitLab credentials. Head over to the GitLab group to which you associated your K8s cluster and create a blank project named code-with-quarkus.

Create project code-with-quarkus

From a Terminal window on your local machine, change directory to the newly unzipped directory code-with-quarkus and execute the command rm .dockerignore to delete the .dockerignore file that came with the sample Quarkus application. After removing this file, execute the following commands to populate your newly created Git project code-with-quarkus with the contents of this directory.

NOTE: Depending on the version of git installed on your local machine, the commands below may vary. Keep in mind that the goal of the steps below is to upload the project on your local machine to your newly created GitLab project.

git init
git remote add origin https://gitlab.com/[REPLACE WITH PATH TO YOUR GROUP]/code-with-quarkus.git
git add .
git commit -m "Initial commit"
git push --set-upstream origin master

At this point, you should have your sample Quarkus application in your GitLab project code-with-quarkus. Modify the generated Dockerfile.jvm file and indicate its location […]

Read More

Why You’re Failing At React Grid View

Grid View is an important element for modern websites because it allows you to present large volumes of information to the user. If you are using React for the front end, you must consider implementing a grid that provides all the functionality you want. However, if you find that your React grid view fails to achieve the speed and user experience you want, it is high time you consider switching to the Sencha GRUI, which provides a rich development and user experience. If you are a React developer looking forward to embedding a grid into your applications, it is important to know why your React grid layout is failing and how to use Sencha GRUI for better performance.

Is Your React Grid View Failing To Load Data Efficiently?

Creating grids is a fun job. However, there are some secrets about JS grids that you might want to know. Usually, we fill grids with a lot of data, so how efficiently the grid loads that data is important when embedding a grid view in your websites. Suppose the grid takes 2-3 minutes to load the complete data set; users will not wait that long. However, building a grid with millisecond load times from scratch in React can be tedious, especially if you are under a tight schedule. This is something you could miss when building your grids with a plain React grid view, and it can lead to your project’s failure. With a better grid view that handles efficient data loading on your behalf, you never need to worry about that.

Does Your React Grid View Fail To Provide All The Functionalities Your Customers Expect?

Not only the efficiency but also the functionality offered by your data grid matters for your project’s success. Suppose your grid can only provide basic functionality like sorting and searching but cannot provide more intuitive features like pagination and infinite scrolling. In that case, there is a possibility that it cannot be extended to provide more advanced features. With time, customer requirements can also change. Therefore, your React grid view can fail over time if it is not easily adjustable and needs several plugins to provide additional functionality. If you want to see examples of successful JS grids, this article might help you.

Does Your React Grid View Resist Easy Customization?

Customizations are important when you work with any web component. Suppose your React grid view needs to be used on another page with some customizations. Can you achieve that easily, with minimal impact on your code? If not, you will have to do additional work to support a customized grid whenever you want something different. Therefore, it is better to avoid such implementations at all costs and look for a solution that enables you to do customizations without hassle. This is where third-party JavaScript frameworks like Sencha can help you.

Is Your React Grid View Failing To Handle Your Growing Data Set?

Data is bound to increase with time. If you have only thousands of records at hand today, you could have millions within the next couple of months or years. Therefore, your React grid must also accommodate this growing data without affecting the loading speed. Your React grid can fail if you cannot easily improve the functionality with […]

Read More