From the blog

TMS FNC Chart: Visualize your Grid Data

With the release of the new TMS FNC Chart version 2.0, it is now possible to show beautiful graphs based on the cell data in a TMS FNC Grid. This requires the TMS FNC UI Pack to be installed alongside TMS FNC Chart. The files for the TTMSFNCChartGridAdapter are available in the installation directory, but due to the dependencies on the TMS FNC UI Pack, they are not added to the packages by default. If you want to use the component at design time, you will need to add the four TTMSFNCChartGridAdapter units (the base unit and the ones with the Reg, DE and RegDE suffixes) to the respective framework packages.

Setup

You can quickly set up the environment necessary to start. You need a TTMSFNCChart, a TTMSFNCGrid and a TTMSFNCChartGridAdapter. Link the Adapter property of the TTMSFNCChart to the TTMSFNCChartGridAdapter, and set the Source of the TTMSFNCChartGridAdapter to the TTMSFNCGrid. You can now start building your graph. By default, the TTMSFNCChartGridAdapter will try to create a new series for each column that has a value in the first row. If it finds a text value in the first row, it will use the first such column it encounters as the x-value text.

TTMSFNCChartGridAdapterSeries

If your data uses a less common layout, or you want to use multi-point data, you can use the TTMSFNCChartGridAdapterSeries. This collection contains an item for each series that should be drawn in the chart. To use it, set the AutoCreateSeries property of the TTMSFNCChartGridAdapter to False and add a new series item; there you can set which columns of the TTMSFNCGrid to use. Now we set some data in the cells and set the format type of the x-axis to vftDateTime, so the adapter tries to parse each cell as a TDateTime. Because the adapter only triggers when a cell has been edited, we need to call the Synchronize procedure when changing cells at run time.
procedure TForm.FormCreate(Sender: TObject);
begin
  TMSFNCGrid.Cells[1,0] := 'Date';
  TMSFNCGrid.Cells[2,0] := 'Low';
  TMSFNCGrid.Cells[3,0] := 'High';
  TMSFNCGrid.Cells[4,0] := 'Open';
  TMSFNCGrid.Cells[5,0] := 'Close';
  TMSFNCGrid.Cells[1,1] := '7/06/2022';
  TMSFNCGrid.Cells[1,2] := '8/06/2022';
  TMSFNCGrid.Cells[1,3] := '9/06/2022';
  TMSFNCGrid.Cells[1,4] := '10/06/2022';
  TMSFNCGrid.Cells[1,5] := '11/06/2022';
  TMSFNCGrid.Cells[2,1] := '24.8';
  TMSFNCGrid.Cells[2,2] := '23.8';
  TMSFNCGrid.Cells[2,3] := '22.8';
  TMSFNCGrid.Cells[2,4] := '21.8';
  TMSFNCGrid.Cells[2,5] := '20.8';
  TMSFNCGrid.Cells[3,1] := '32.5';
  TMSFNCGrid.Cells[3,2] := '30.2';
  TMSFNCGrid.Cells[3,3] := '34.6';
  TMSFNCGrid.Cells[3,4] := '30.2';
  TMSFNCGrid.Cells[3,5] := '33.7';
  TMSFNCGrid.Cells[4,1] := '27.3';
  TMSFNCGrid.Cells[4,2] := '25.3';
  TMSFNCGrid.Cells[4,3] := '28.0';
  TMSFNCGrid.Cells[4,4] := '30.1';
  TMSFNCGrid.Cells[4,5] := '30.0';
  TMSFNCGrid.Cells[5,1] := '25.3';
  TMSFNCGrid.Cells[5,2] := '28.1';
  TMSFNCGrid.Cells[5,3] := '30.2';
  TMSFNCGrid.Cells[5,4] := '30.2';
  TMSFNCGrid.Cells[5,5] := '24.6';
  TMSFNCChart.DefaultLoadOptions.XValuesFormatType := vftDateTime;
  TMSFNCChartGridAdapter.Synchronize;
end;

procedure TForm.TMSFNCChartGridAdapterSynchronized(Sender: TObject);
begin
  TMSFNCChart.Series[0].ChartType := ctCandleStick;
  TMSFNCChart.Series[0].MinXOffsetPercentage := 10;
  TMSFNCChart.Series[0].MaxXOffsetPercentage := 10;
end;

Events to customize the data

To make it possible to manipulate the grid data further, we've added a couple of events so you can tailor the chart to your liking. I've changed the TTMSFNCChart to a TTMSFNCBarChart, but it's also possible to set the chart type of the series in the Synchronized event.
// Example data
procedure TForm.FormCreate(Sender: TObject);
begin
  TMSFNCGrid.Cells[1,0] := 'Prod-Num';
  TMSFNCGrid.Cells[2,0] := 'Product';
  TMSFNCGrid.Cells[3,0] := 'Price';
  TMSFNCGrid.Cells[4,0] := 'Volume';
  TMSFNCGrid.Cells[5,0] := 'Color';
  TMSFNCGrid.Cells[1,1] := 'CA-J-123.45';
  TMSFNCGrid.Cells[1,2] := 'CA-S-155.78';
  TMSFNCGrid.Cells[1,3] := 'CA-S-267.36';
  TMSFNCGrid.Cells[1,4] := 'CA-D-102.10';
  TMSFNCGrid.Cells[1,5] := 'CA-S-403.48';
  TMSFNCGrid.Cells[2,1] := 'Jeans';
  TMSFNCGrid.Cells[2,2] := 'Shirts';
  TMSFNCGrid.Cells[2,3] := 'Sweaters';
  TMSFNCGrid.Cells[2,4] := 'Dresses';
  TMSFNCGrid.Cells[2,5] := 'Shoes';
  TMSFNCGrid.Cells[3,1] := '48.99';
  TMSFNCGrid.Cells[3,2] := '26.99';
  TMSFNCGrid.Cells[3,3] := '34.99';
  TMSFNCGrid.Cells[3,4] […]

Read More

Display and edit math formulae using the new math editor component

With the latest update of TMS FNC WX Pack, we released the new TTMSFNCWXMathEditor component. This component lets you easily display and edit math formulae. No code is required to get started: simply drop it on the form and you're ready to go.

Virtual Keyboards

Not only can you type the math formula you want directly, you can also use the virtual keyboards that are available in the component. There are multiple keyboards to choose from, such as one for functions or one for typing symbols. These keyboards are customizable: you can set a theme, choose which keyboards to display, and even decide when to show the virtual keyboard (on focus, manual, off, or auto).

LaTeX

You can also type LaTeX commands to render your formula. Currently, over 800 LaTeX commands are supported, and the editor even gives you a visual aid as you start typing some of the commands. You can use LaTeX directly in the editor or use the Math property to set a default value to be shown. You can also export the typed-in formula to LaTeX or to several other formats, such as spoken text or ASCIIMath.

Video

Watch this video to get started with the TTMSFNCWXMathEditor.

Read More

Learn Python with Pj! Part 5 – Build a hashtag tracker with the Twitter API

This is the fifth and final installment in the Learn Python with Pj! series. Make sure to read the earlier installments first.

Putting it all together

I've completed my Python course on Codecademy, and I'm excited to put the skills I learned into building something practical. I've worked with the Twitter API before; I wrote a few bots in Node.js to make them tweet and respond to tweets they're tagged in. I thought it'd be fun to work with the API again, but this time in Python. I didn't just want to make another bot, so I had to figure out something else. In this case, I made a bot that can track hashtags being used in real time on Twitter. My repo contains a few different files, but live_tweets.py is what we'll focus on in this blog. Let's talk about how I built it and what it does.

import tweepy
import config

auth = tweepy.OAuth1UserHandler(
    config.consumer_key,
    config.consumer_secret,
    config.access_token,
    config.access_token_secret
)
api = tweepy.API(auth)

# prints the text of the tweet using hashtag designated in stream.filter(track=[])
class LogTweets(tweepy.Stream):
    def on_status(self, status):
        date = status.created_at
        username = status.user.screen_name
        try:
            tweet = status.extended_tweet["full_text"]
        except AttributeError:
            tweet = status.text
        print("**Tweet info**")
        print(f"Date: {date}")
        print(f"Username: {username}")
        print(f"Tweet: {tweet}")
        print("*********")
        print("*********\n")

if __name__ == "__main__":
    # creates instance of LogTweets with authentication
    stream = LogTweets(config.consumer_key, config.consumer_secret,
                       config.access_token, config.access_token_secret)
    # hashtags as str in list will be watched live on twitter.
    hashtags = []
    print("Looking for Hashtags...")
    stream.filter(track=hashtags)

Here's how this all works. First, we import two modules: Tweepy and config. Tweepy is a wrapper that makes using the Twitter API very easy. Config allows us to use config files and keep our secrets safe.
This is important since using the Twitter API involves four keys that are specific to your Twitter developer account. Getting these keys is covered in the Twitter documentation. We'll talk about what's in the config file and how it works later. The next line defines the variable auth using Tweepy's built-in authorization handler. Normally, you'd put the keys in directly here, but since we're trying to keep secrets safe, we handle them through the config file. To call the variables hosted in the config file, we type config.variable_name. Finally, to access the Tweepy API, we create the variable api, with the auth variable from the line above passed into tweepy.API(). Now the variable api gives us access to all the features in Tweepy's Twitter API library.

You're invited! Join us on June 23rd for the GitLab 15 launch event with DevOps guru Gene Kim and several GitLab leaders. They'll show you what they see for the future of DevOps and The One DevOps Platform.

For our purposes, we want to find a hashtag being used, then collect the tweet that used it and print some information about the tweet to the console. To make this happen, we've created a class called LogTweets that takes tweepy.Stream as input. Stream is a Twitter API term that refers to all of the tweets being posted on Twitter at any given moment. Think of it as opening a window looking out onto every single tweet as it's posted. We have to keep this connection open in order to find tweets that are using our hashtag. Inside LogTweets, we define a […]
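The contents of the config file aren't shown in this excerpt, so here is a minimal sketch of what a config.py could look like. The variable names match those referenced by live_tweets.py; the key values are placeholders you would replace with the keys from your own Twitter developer account.

```python
# config.py -- keeps API secrets out of the main script.
# All four values below are placeholders; substitute your own keys from the
# Twitter developer portal, and keep this file out of version control
# (e.g. list it in .gitignore) so the secrets stay safe.

consumer_key = "YOUR_CONSUMER_KEY"
consumer_secret = "YOUR_CONSUMER_SECRET"
access_token = "YOUR_ACCESS_TOKEN"
access_token_secret = "YOUR_ACCESS_TOKEN_SECRET"
```

With this file next to live_tweets.py, `import config` makes each key available as `config.consumer_key`, `config.consumer_secret`, and so on.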

Read More

Ski first, work later – How to win the burnout battle

It's 9:13 am and 20 degrees outside in Big Sky, Montana. I'm bundled up in my warm rainbow pride ski suit. Dangling 30 feet in the crisp air, perched on a ski lift, I'm on my way up to a double black diamond run 9,382 feet above sea level. There are few people out this early on a Wednesday morning. I ski off the top of the lift and enjoy a beautifully untracked run of champagne powder snow, fresh from last night's snowstorm. This is a normal start to the workday for me. And I have a bit of a secret to admit: this is exactly why I joined GitLab.

Something's gotta give

Rewind two years to January 2020, before I joined GitLab. Before I had established my daily skiing routine. Before I moved to Big Sky. Before the global Covid-19 pandemic. I had decided I needed to make a change in my life. I had spent the past decade climbing the startup tech career ladder. Along the way I had sacrificed my health, happiness, and my mental and emotional well-being. I was burnt out. While I don't think I'd change anything going back, I knew the next decade couldn't sustain that lack of work-life balance. I needed to get back to being the person my friends and family knew: a slim guy with a smile always on his face and a hopeful outlook for the future.

A remote change

GitLab had been on my radar for a number of years, as many of my tech friends had become DevOps engineers, but I had not used it myself. What I did know was that at the time they were one of the few truly remote companies, with no offices and a global team embracing an async work style. While I hadn't ever worked remotely before, I knew I liked the idea of not being stuck in a bland office of noisy and distracting open floor layout workspaces surrounded by silly ping pong tables and unlimited snacks.
My previous employers thought these things made for a 'supportive environment' and 'great work culture'. I couldn't disagree more. It was a scary thought to have less structure, but my previous decade had shown me those offices weren't conducive to my sanity, happiness, or productivity. So I decided to go all in. I knew I wanted to make a big change, so I tested GitLab when I was interviewing: I gauged reactions from my interview panel as I described my desire to move to a ski mountain and balance working and skiing. I was caught by surprise. Every person I interviewed with loved the idea and assured me that GitLab's remote and async working style would support this plan. Just about everyone had a story about how they themselves had adjusted their schedule to add flexibility to their lives. I was convinced. This was the future.

A global pandemic

Two months after joining GitLab in January 2020, the pandemic ruined my plans to relocate to a wintry wonderland. I delayed […]

Read More

GitLab Heroes Unmasked – How I became acquainted with the GitLab Agent for Kubernetes

A key to GitLab's success is our vast community of advocates. Here at GitLab, we call these active contributors "GitLab Heroes." Each hero contributes to GitLab in numerous ways, including elevating releases, sharing best practices, speaking at events, and more. Jean-Philippe Baconnais is an active GitLab Hero who hails from France. We applaud his contributions, including leading community engagement events. Baconnais shares his interest in Kubernetes and explains how to deploy and monitor an application in Kubernetes without leaving GitLab.

I've been a developer since 2007, and I've learned a lot about continuous integration, deployment, infrastructure, and monitoring. In both my professional and personal time, my favorite activity remains software development. After creating a new application with multiple components, I wanted to deploy it on Kubernetes, which has become hugely popular over the last few years. It was a chance to experiment with the platform, and it promised to be a lot of fun. I knew some of the terminology, having used Kubernetes in production for five years, but only as a user; Kubernetes administration is not my "cup of tea" 😅.

My first deployment in Kubernetes

When I decided to deploy an application on Kubernetes, I wasn't sure where to start until, navigating my project in GitLab, I saw a menu called "Kubernetes." I wanted to know what GitLab was hiding behind it. Does this feature link my project's sources to a Kubernetes cluster? I used the credit offered by Google Cloud to discover and test the platform. Deploying my application on Kubernetes was easy. I wrote a blog post in 2019 describing how I did it, or rather, how GitLab helped me create this link so easily. In this blog post I will go further and talk about what has changed since then. Behind the "Kubernetes" menu, GitLab helps you integrate Kubernetes into your project. You can create a cluster from GitLab on Google Cloud Platform (GCP) or Amazon Web Services (AWS).
If you already have a cluster on one of these platforms or anywhere else, you can connect to it. You just need to specify the cluster name, Kubernetes API URL, and certificate. GitLab is a DevOps platform, and one of the DevOps activities it covers is monitoring. GitLab deploys an instance of Prometheus to get information about your cluster and facilitate the monitoring of your application. For example, you can see how many pods are deployed in your environment and their states. You can also view charts and information about your cluster, like available memory and CPU. All these metrics are available by default, without changing the applications in your cluster. You can also read the logs directly in GitLab. For a developer, it's great to have all this information in one tool, and it saves time.

A new way to integrate Kubernetes

Everything I explained in the previous section no longer quite exists. The release of GitLab 14.5 was the beginning of a revolution. The certificate-based Kubernetes integration has security limitations, and many issues were filed about it. The GitLab teams worked on a new way to connect to your cluster, and in version 14.5 the GitLab Agent for Kubernetes was released!

GitLab Agent for Kubernetes

The GitLab Agent for Kubernetes is a new way to connect to your cluster. This solution is easy to […]
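The excerpt ends before showing the agent in action, but as a sketch: the agent is configured through a file stored in a GitLab repository, conventionally at .gitlab/agents/&lt;agent-name&gt;/config.yaml. The agent and project names below are placeholders; a minimal configuration granting a project CI/CD access to the cluster might look like this:

```yaml
# .gitlab/agents/my-agent/config.yaml -- "my-agent" is a placeholder name
ci_access:
  projects:
    # Projects listed here may run CI/CD jobs against the cluster
    # through this agent; the path is a placeholder.
    - id: my-group/code-with-kubernetes
```

Once the agent is registered and this file is committed, CI/CD jobs in the listed project can talk to the cluster without any certificate exchange.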

Read More

How to automate software delivery using Quarkus and GitLab

In this day and age, organizations need to deliver innovative solutions to their customers faster than ever to stay competitive. This is why solutions that speed up software development and delivery, such as Quarkus and GitLab, are being adopted by teams across the world. Quarkus, also known as Supersonic Subatomic Java, is an open source, Kubernetes-native Java stack tailored for OpenJDK HotSpot and GraalVM, crafted from respected Java libraries and standards. Quarkus has been steadily growing in popularity and use because of the benefits it delivers: cost savings, faster time to market/value, and reliability. Quarkus offers two modes: Java and native. Java mode builds your application using the JDK, while native mode compiles your Java code into a native executable. GitLab, the One DevOps Platform, includes capabilities for all DevOps stages, from planning to production, all with a single model and user interface to help you ship secure code faster to any cloud and drive business results. Besides DevOps support, GitLab also offers GitOps support. The combination of Quarkus and GitLab can empower your developers and operations teams to collaborate better and spend more time innovating to deliver business value and differentiating capabilities to end users. In this article, we show how to automate the software delivery of a generated Quarkus application in Java mode using GitLab Auto DevOps. Below, we list the steps to accomplish this.

Prerequisite

The prerequisite for the following instructions is to have a K8s cluster up and running and associated with a group in your GitLab account. For an example of how to do this, please watch this video.

Generate your Quarkus project using the generator and upload it to GitLab

From a browser window, go to the Quarkus generator site, https://code.quarkus.io, and click the button Generate your application.
Generate a sample Quarkus application using the generator

On the popup window, click the button DOWNLOAD THE ZIP to download a sample Quarkus application in a ZIP file to your local machine. The downloaded file is named code-with-quarkus.zip. Unzip the file on your local machine in a directory of your choice. This creates a new directory called code-with-quarkus with all the files for the sample Quarkus application. From a browser window, open https://gitlab.com and log in using your GitLab credentials. Head over to the GitLab group to which you associated your K8s cluster and create a blank project named code-with-quarkus.

Create project code-with-quarkus

From a Terminal window on your local machine, change directory to the newly unzipped directory code-with-quarkus and execute the command rm .dockerignore to delete the .dockerignore file that came with the sample Quarkus application. After removing this file, execute the following commands to populate your newly created GitLab project code-with-quarkus with the contents of this directory.

NOTE: Depending on the version of git installed on your local machine, the commands below may vary. Keep in mind that the goal of these steps is to upload the project on your local machine to your newly created GitLab project.

git init
git remote add origin https://gitlab.com/[REPLACE WITH PATH TO YOUR GROUP]/code-with-quarkus.git
git add .
git commit -m "Initial commit"
git push --set-upstream origin master

At this point, you should have your sample Quarkus application in your GitLab project code-with-quarkus. Modify the generated Dockerfile.jvm file and indicate its location […]

Read More

Why You’re Failing At React Grid View

Grid views are an important element of modern websites because they let you present large volumes of information to the user. If you are using React for the front end, you must consider implementing a grid that provides all the functionality you want. However, suppose you find that your React grid view fails to achieve the speed and user experience you want. In that case, it is high time to consider switching to Sencha GRUI, which provides a rich development and user experience. If you are a React developer looking to embed a grid into your applications, it is important to know why your React grid layout is failing and how to use Sencha GRUI for better performance.

Is Your React Grid View Failing To Load Data Efficiently?

Creating grids is a fun job, but there are some things about JS grids that you might want to know. We usually fill grids with a lot of data, so how efficiently the grid loads that data matters when embedding a grid view in your website. Suppose the grid takes two to three minutes to load the complete data set. Building a grid from scratch with only milliseconds of load time in React can be tedious, especially if you are on a tight schedule, so this is something you could miss when building your grids, and it can lead to project failure. You need a grid view that handles efficient data loading on your behalf; then you never need to worry about it.

Does Your React Grid View Fail To Provide All The Functionality Your Customers Expect?

It is not only efficiency: the functionality offered by your data grid also matters for your project's success. Suppose your grid can only provide basic functionality like sorting and searching but cannot provide more intuitive features like pagination and infinite scrolling. In that case, there is a chance it cannot be extended to provide more advanced features.
Customer requirements also change over time, so your React grid view can fail later if it is not easily adjustable and needs several plugins to provide additional functionality. If you want to see examples of successful JS grids, this article might help you.

Can Your React Grid View Be Customized Easily?

Customization is important when you work with any web component. Suppose your React grid view needs to be used on another page with some customizations. Can you achieve that easily, with minimal impact on your code? If not, you will have to do additional work to support a customized grid whenever you want something different. Therefore, it is better to avoid such implementations and look for a solution that lets you customize without hassle. This is where third-party JavaScript frameworks like Sencha can help you.

Is Your React Grid View Failing To Handle Your Growing Data Set?

Data is bound to increase with time. If you have only thousands of records at hand today, you could have millions within the next couple of months or years. Therefore, your React grid must accommodate this growing data without affecting the loading speed. Your React grid can fail if you cannot easily improve the functionality with […]

Read More

Digital Twin Twitter takeovers: May recap

Lauren and Sam have a comprehensive skill set in strategy, design, and technology, and their extended reality (XR) studio, RefractAR, specializes in spatial activations. These two innovators created a whole car maintenance app with Unity MARS.

1. If you're crunched for time, use image trackers for AR
2. Polycam makes it easy to scan and create digital twins
3. How to create your AR experience with Unity MARS

Follow Lauren
Follow Sam

Read More

World Oceans Day: RT3D projects make waves and encourage conservation

Healthy oceans are essential for the survival of all life on Earth, so we need to protect them. We’re committed to ocean conservation as part of our ESG (environmental, social, and governance) efforts to build a more sustainable future and invest in our planet. Here are some exciting projects using Unity to celebrate the planet’s oceans, educate audiences, and encourage action: An Otter Planet by Habithéque is an in-progress PC game designed to teach players about water and help them understand its importance to all life on earth. In addition to raising awareness through play, An Otter Planet will raise money for charities to support water-related protection and revitalization efforts through in-game purchases and charitable donations. Raft, a PC game developed by Redbeet Interactive, highlights the incredible vastness of the open ocean. Players wake up adrift on a raft and then fight for survival by crafting, growing food, and avoiding shark attacks. Experiencing this game provides a new appreciation for the danger, stillness, and mystery of the oceans. The Hydrous is an innovative project that designs science-based augmented and virtual reality experiences to engage audiences with the wonders of ocean life. The creators’ goal is to provide “equitable access to ocean exploration,” which in turn builds understanding of beautiful and threatened marine ecosystems. — We believe that the world is a better place with more creators in it, and we’re excited to see the inspiring work being done to realize a sustainable, inclusive, and equitable world for all. Want to hear more inspiring creator stories? Sign up for Unity’s Social Impact newsletter for regular news and updates about our Social Impact work.

Read More

Why You Should Know About Machine Learning and Artificial Intelligence

It is undeniable that technology is rapidly evolving. Things that were once only concepts are now being materialized. We are embracing a new digital age, where artificial intelligence is no longer just a product of science fiction novels but a real-life technology. In this video, Jim McKeeth is joined by Embarcadero MVP Yilmaz Yoru to tackle everything about Machine Learning and Artificial Intelligence. We learn how this technology has evolved over time, which IDEs, programming languages, and libraries are good for AI, and what the future of this technology holds.

Things you need to know about artificial intelligence and machine learning

Generally, Artificial Intelligence refers to the intelligence exhibited by machines capable of carrying out tasks that usually require human intelligence. It refers to the simulation of human intelligence in machines that are programmed to think like humans and mimic their actions. The term may also be applied to any machine that exhibits traits associated with a human mind, such as learning and problem-solving. Some of these mental capabilities and functions are also referred to as Artificial General Intelligence, better known as Strong AI. Machine Learning, on the other hand, is a subset of AI that uses algorithms to learn from data, find patterns in the data, and make predictions about future events or outcomes. Today, Artificial Intelligence is applied to many things, like chatbots, virtual assistants, autonomous cars, and more. When it comes to Machine Learning and AI development, the first thing to consider is picking the right programming language, depending on what kind of machine or software you are building. In this video, we get a list of ideal programming languages that work well with AI and Machine Learning, including Delphi, C++ with C++ Builder, Python, and Java, to name a few. We also learn about different libraries and resources you can use for AI software development.
This includes TensorFlow, OpenCV, Mitov Software's Intelligence Lab, and more. Jim McKeeth also provides demos showing the aforementioned libraries in action using Delphi. The video also discusses AI ethics, the AI singularity, movies and programs that use AI and Machine Learning as their main subjects, as well as what we can expect from these technologies in the future. To learn more about Artificial Intelligence and Machine Learning, feel free to watch the webinar below.
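The "learn from data, find patterns, make predictions" loop described above can be illustrated with a tiny, self-contained sketch. This is plain Python, not taken from the webinar: a one-nearest-neighbour classifier whose "training" is simply memorising labelled points, and whose "prediction" is finding the closest stored example.

```python
# A minimal 1-nearest-neighbour classifier: "training" stores labelled
# examples; "prediction" returns the label of the closest stored example.

def euclidean(a, b):
    # Straight-line distance between two points of equal dimension.
    return sum((x - y) ** 2 for x, y in zip(a, b)) ** 0.5

class NearestNeighbour:
    def fit(self, points, labels):
        # Learn from data: memorise the training set.
        self.points = list(points)
        self.labels = list(labels)
        return self

    def predict(self, point):
        # Find the pattern: the nearest known example wins.
        distances = [euclidean(point, p) for p in self.points]
        return self.labels[distances.index(min(distances))]

# Two clusters of 2-D points labelled "low" and "high".
model = NearestNeighbour().fit(
    [(1, 1), (1, 2), (8, 8), (9, 8)],
    ["low", "low", "high", "high"],
)
print(model.predict((2, 1)))  # -> low  (nearest to the "low" cluster)
print(model.predict((8, 9)))  # -> high (nearest to the "high" cluster)
```

Real libraries such as TensorFlow or the ones demoed in the webinar scale this same idea to far richer models, but the ingredients stay the same: training data, a learned representation, and a prediction step.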

Read More