{"id":13161,"date":"2025-03-13T18:43:07","date_gmt":"2025-03-13T18:43:07","guid":{"rendered":"https:\/\/comet-marketing-site.lndo.site\/?page_id=13161"},"modified":"2025-11-03T20:24:24","modified_gmt":"2025-11-03T20:24:24","slug":"pentoai","status":"publish","type":"page","link":"https:\/\/www.comet.com\/site\/customers\/pentoai\/","title":{"rendered":"Using Comet Panels for Computer Vision at Pento.ai"},"content":{"rendered":"\n<figure class=\"wp-block-image aligncenter\"><img loading=\"lazy\" decoding=\"async\" width=\"1536\" height=\"823\" src=\"https:\/\/www.comet.com\/site\/wp-content\/uploads\/2022\/06\/Screen-Shot-2020-08-06-at-12.35.07-PM-1536x823-1.png\" alt=\"\" class=\"wp-image-1335\" srcset=\"https:\/\/www.comet.com\/site\/wp-content\/uploads\/2022\/06\/Screen-Shot-2020-08-06-at-12.35.07-PM-1536x823-1.png 1536w, https:\/\/www.comet.com\/site\/wp-content\/uploads\/2022\/06\/Screen-Shot-2020-08-06-at-12.35.07-PM-1536x823-1-300x161.png 300w, https:\/\/www.comet.com\/site\/wp-content\/uploads\/2022\/06\/Screen-Shot-2020-08-06-at-12.35.07-PM-1536x823-1-1024x549.png 1024w, https:\/\/www.comet.com\/site\/wp-content\/uploads\/2022\/06\/Screen-Shot-2020-08-06-at-12.35.07-PM-1536x823-1-768x412.png 768w\" sizes=\"auto, (max-width: 1536px) 100vw, 1536px\" \/><\/figure>\n\n\n\n<p><em>Intro by Niko Laskaris<\/em><\/p>\n\n\n\n<p>We released a code-based custom visualization builder called Custom Panels. As part of the rollout, we\u2019re featuring user stories from some of the awesome researchers using Comet as part of their research and development toolkit. One of these teams,&nbsp;<a href=\"http:\/\/pento.ai\/\">Pento.ai<\/a>, are long-time Comet users and were part of the beta test group for Custom Panels. Pento is a top machine learning consulting firm working with some of the biggest companies in the world. 
We were excited to see what they\u2019d come up with given the freedom to build any visualization they wanted, and we weren\u2019t disappointed.<\/p>\n\n\n\n<h2 class=\"wp-block-heading\"><strong>Comet Custom Panels<\/strong>&nbsp;at Pento.ai<\/h2>\n\n\n\n<p><em>Written by&nbsp;<a href=\"https:\/\/www.linkedin.com\/in\/psoto23\/\">Pablo Soto<\/a>&nbsp;and&nbsp;<a href=\"http:\/\/www.linkedin.com\/in\/agustin-azzinnari\">Agustin Azzinnari<\/a><\/em><\/p>\n\n\n\n<p>Pento is a company specializing in building software solutions that leverage the power of machine learning. By incorporating data and learning into our clients\u2019 processes, we help them make optimal decisions or even automate entire processes.<\/p>\n\n\n\n<p>There are many consulting companies out there with similar offerings, so from the beginning we\u2019ve tried to take a novel approach. Pento is composed of a group of partners who have ample experience in the industry, delivering solutions with real and measurable results.<\/p>\n\n\n\n<p>This autonomy allows us to cover a lot of ground while remaining highly specialized: we are partners with a proven track record in computer vision, predictive analytics, and natural language processing. We\u2019ve also let this focus spill over into the open-source community we draw so much from, by contributing tools such as our human perception library,&nbsp;<a href=\"https:\/\/github.com\/pento-group\/terran\">Terran<\/a>.<\/p>\n\n\n\n<p>In today\u2019s job market, machine learning development is a highly sought-after skill, and many companies are popping up to fill the gap opened by new advances in the area. However, not every ML project goes according to plan: there are many aspects one needs to balance at the same time.<\/p>\n\n\n\n<p>As such, ML engineers need to make use of all the tools at hand to ensure this process goes smoothly. 
We\u2019ve found Comet, in particular, to be one of the tools that has earned a permanent place in our toolbox.<\/p>\n\n\n\n<p>Being organized and clear when delivering results is one of the keys to a successful ML initiative. If the series of steps taken to reach an automated decision isn\u2019t clear, or if the client simply doesn\u2019t understand the results presented, the project is bound to fail.<\/p>\n\n\n\n<p>Due to this, it\u2019s crucial to keep track of all your experiments, to understand and preserve a record of all of your research over the course of a project. If these experiments take hours or even days to run, it\u2019s easy to forget why we ran them in the first place. Here\u2019s where we\u2019ve found tools such as Comet to be extremely useful: they centralize the data for all experiments, attach all the metadata we need to them, let us visualize intermediate results, and make experiments quick to reproduce.<\/p>\n\n\n\n<p>Working with one client after another, we end up re-using the same visualizations and analytical tools over and over, leveraging the experience acquired in one project for the next one. Given that machine learning is an incremental process, we are constantly tweaking our code and systems and carrying them over to our next project. The Custom Panels feature in Comet is a step towards perfecting that re-usability.<\/p>\n\n\n\n<p>Good visualizations are key to any successful ML project, but this is especially true for Computer Vision (CV) projects. In the following section, we will present a simple CV project and explore how we can use Comet to improve our experimentation process. 
We\u2019ll be using our&nbsp;<a href=\"https:\/\/github.com\/pento-group\/terran\">open-source human perception library<\/a>, Terran, to illustrate the process we\u2019d normally follow in a real project.<\/p>\n\n\n\n<p><strong>What\u2019s Terran?<\/strong><\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"768\" height=\"241\" src=\"https:\/\/www.comet.com\/site\/wp-content\/uploads\/2022\/06\/Screen-Shot-2020-08-06-at-12.52.59-PM-768x241-1.png\" alt=\"\" class=\"wp-image-1333\" srcset=\"https:\/\/www.comet.com\/site\/wp-content\/uploads\/2022\/06\/Screen-Shot-2020-08-06-at-12.52.59-PM-768x241-1.png 768w, https:\/\/www.comet.com\/site\/wp-content\/uploads\/2022\/06\/Screen-Shot-2020-08-06-at-12.52.59-PM-768x241-1-300x94.png 300w\" sizes=\"auto, (max-width: 768px) 100vw, 768px\" \/><\/figure>\n\n\n\n<p>Terran is a human perception library that provides computer vision techniques and algorithms to facilitate building systems that interact with people. Whether it\u2019s recognizing somebody by their face or detecting when they raise their hand, Terran provides the primitives necessary to act in response to a person.<\/p>\n\n\n\n<p><strong>Building a Panel<\/strong><\/p>\n\n\n\n<p>The example project will consist of a simple program that detects faces in a video. To do this, we\u2019ll be using three key functionalities from Terran: face detection, recognition, and tracking. We\u2019ll implement a custom visualization that\u2019ll help us understand how our program is performing. A common technique is to plot the face embeddings generated by Terran as points on a plane and check whether these points are reasonably structured.<\/p>\n\n\n\n<p>For instance, we might train an image classifier, embed its internal representations into a 2-dimensional space, and check that images of the same classes are embedded in nearby regions of this resulting space. 
If this assumption doesn\u2019t hold, it\u2019s possible our samples are under-represented or even mislabeled, or that there\u2019s an issue with our classifier, so it\u2019s a good check to run every once in a while.<\/p>\n\n\n\n<p>In our example, we are going to generate the proposed embedding visualizations using the results of our face detection and recognition, and then make sure that faces corresponding to the same person are indeed placed nearby. The video we\u2019ll be using is the YouTube clip referenced in the code below.<\/p>\n\n\n\n<p>First, let\u2019s go with the traditional, fully-Python approach:<\/p>\n\n\n\n<p>We perform face detection and feature extraction on each frame of the video using Terran, which is as simple as using the face_tracking and extract_features functions. By feature extraction we mean retrieving a 1024-dimensional representation for each face, where faces that are similar (and thus probably correspond to the same person) have a small cosine distance between them.<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">from terran.io import open_video\nfrom terran.face import Detection, extract_features\nfrom terran.tracking import face_tracking<\/pre>\n\n\n\n<pre class=\"wp-block-preformatted\">video = open_video(\n    'https:\/\/www.youtube.com\/watch?v=ggFKLxAQBbc',\n    batch_size=64\n)<\/pre>\n\n\n\n<pre class=\"wp-block-preformatted\">tracker = face_tracking(\n    video=video,\n    detector=Detection(short_side=832),\n)<\/pre>\n\n\n\n<pre class=\"wp-block-preformatted\">faces = []\nfeatures = []\nfor frames in video:\n    faces_per_frame = tracker(frames)\n    features_per_frame = extract_features(frames, faces_per_frame)<\/pre>\n\n\n\n<pre class=\"wp-block-preformatted\"># Still inside the loop over the video's frame batches:\nfor frame, frame_faces, frame_features in zip(\n    frames, faces_per_frame, features_per_frame\n):\n    for face, feature in zip(frame_faces, frame_features):\n        # Crop the detected face out of the frame.\n        face_crop = crop_expanded_pad(\n            frame, face['bbox'], factor=0.0\n        )\n\n        faces.append(face_crop)\n        
        features.append(feature)<\/pre>\n\n\n\n<p>Next, perform dimensionality reduction over these representations by using t-SNE:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">from sklearn.manifold import TSNE\n\nreduced = TSNE(\n    n_components=2,\n    perplexity=20.0,\n    n_iter=5000,\n    metric='cosine',\n    n_jobs=-1\n).fit_transform(features)<\/pre>\n\n\n\n<p>And finally, use the t-SNE results to get the closest neighbors to each point:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">from scipy.spatial.distance import cdist\n\n# Pairwise distances between the reduced points; for each point, keep the\n# k nearest neighbors, skipping the first column (the point itself).\nk = 5\nd = cdist(reduced, reduced)\nneighbors = d.argsort(axis=1)[:, 1:k + 1]<\/pre>\n\n\n\n<p>Even though Python visualizations can be helpful, they are still static, which makes it difficult to navigate through the results without re-running the code each time with different values. Moreover, it\u2019s really hard to keep track of the visualizations for each experiment, especially if we have to manually generate them on every run. All of this makes the experimentation process slow and error-prone.<\/p>\n\n\n\n<p>Fortunately, Comet has found a solution for this with Comet Panels. They provide a flexible interface that allows us to create custom interactive visualizations that integrate seamlessly with the rest of Comet\u2019s features, such as asset logging and experiment comparisons.<\/p>\n\n\n\n<p>Extending our example above to use Panels is simple. The following diagram shows how everything fits together. So far we have implemented the blue and green boxes. Now we need to implement the Comet integration (orange).<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"300\" height=\"251\" src=\"https:\/\/www.comet.com\/site\/wp-content\/uploads\/2022\/06\/Screen-Shot-2020-08-06-at-12.53.12-PM-300x251-1.png\" alt=\"\" class=\"wp-image-1337\"\/><\/figure>\n\n\n\n<p>The first thing we need to do is to upload all the data required by the visualizations. 
In our case, this means uploading the face crops from step 2 and the face embeddings from step 4 (that is, the 2-dimensional embeddings, so we use up less storage).<\/p>\n\n\n\n<p>We\u2019ll do this from within our Python training code, following the usual steps we go through when using Comet:<\/p>\n\n\n\n<p>1. We first create the experiment:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">from comet_ml import Experiment\n\nexperiment = Experiment('API-KEY')<\/pre>\n\n\n\n<p>2. Then log the necessary assets. First the face crops:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">for face_name, face in enumerate(faces):\n    experiment.log_image(face, name=face_name)<\/pre>\n\n\n\n<p>3. Then the embeddings and the pre-calculated neighbors data:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">data = dict(\n    x=reduced[:, 0].tolist(),\n    y=reduced[:, 1].tolist(),\n    faces=[f'#{face_id}' for face_id in range(len(faces))],\n    neighbors=neighbors.tolist()\n)\n\nexperiment.log_asset_data(data, name='tsne.json')<\/pre>\n\n\n\n<p>Now that we have the data available in Comet, we need to build the Panel. 
The basic interface to be implemented by our Panel is the following:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">class MyPanel extends Comet.Panel {\n  setup() {\n    \/\/ Configuration.\n  }\n\n  draw(experimentKeys, projectId) {\n    \/\/ Select experiment.\n    this.drawOne(selectedExperimentKey);\n  }\n\n  drawOne(experimentKey) {\n    \/\/ Create and initialize the chart.\n  }\n}<\/pre>\n\n\n\n<p>The&nbsp;<code>setup<\/code>&nbsp;method is a good place to define all the configuration for your panel. In our case we defined some options for our Plotly chart:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">setup() {\n  this.options = {\n    layout: {\n      showlegend: false,\n      legend: {\n        orientation: \"h\"\n      },\n      title: {\n        text: \"Embeddings TSNE\"\n      }\n    }\n  };\n}<\/pre>\n\n\n\n<p>Our Panel only visualizes data from a single experiment, so we need to apply the approach described&nbsp;<a href=\"https:\/\/www.comet.com\/docs\/javascript-sdk\/getting-started\/#single-experiment-workaround\">here<\/a>. In the&nbsp;<code>draw<\/code>&nbsp;method, we select the experiment we want to explore, while in the&nbsp;<code>drawOne<\/code>&nbsp;method, we create the actual plot. 
In order to build the plot, we need to fetch the data we uploaded to Comet by using the Javascript SDK:<\/p>\n\n\n\n<pre class=\"wp-block-preformatted\">drawOne(experimentKey) {\n  \/\/ Instantiate Plotly chart.\n\n  \/\/ Fetch face images.\n  this.api.experimentImages(experimentKey)\n    .then(images =&gt; {\n      \/\/ Fetch tSNE coordinates and nearest neighbors data.\n      this.api.experimentAssetByName(experimentKey, \"tsne.json\")\n        .then(result =&gt; {\n          \/\/ Draw points in chart.\n        });\n    });\n}<\/pre>\n\n\n\n<p>Once we have the data we need in the panel, we can make use of the whole JavaScript, HTML, and CSS ecosystem to create our custom visualization.<\/p>\n\n\n\n<figure class=\"wp-block-image\"><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"640\" src=\"https:\/\/www.comet.com\/site\/wp-content\/uploads\/2022\/06\/Screen-Shot-2020-08-06-at-12.53.23-PM-1024x640-1.png\" alt=\"\" class=\"wp-image-1336\" srcset=\"https:\/\/www.comet.com\/site\/wp-content\/uploads\/2022\/06\/Screen-Shot-2020-08-06-at-12.53.23-PM-1024x640-1.png 1024w, https:\/\/www.comet.com\/site\/wp-content\/uploads\/2022\/06\/Screen-Shot-2020-08-06-at-12.53.23-PM-1024x640-1-300x188.png 300w, https:\/\/www.comet.com\/site\/wp-content\/uploads\/2022\/06\/Screen-Shot-2020-08-06-at-12.53.23-PM-1024x640-1-768x480.png 768w\" sizes=\"auto, (max-width: 1024px) 100vw, 1024px\" \/><\/figure>\n\n\n\n<p>To view our custom panel, check out our public&nbsp;<a href=\"https:\/\/www.comet.com\/pento\/terran-face-detection\/view\/QNeUChV04YrxjZNgsxyWNXgyP\">Comet project here<\/a>.<\/p>\n\n\n\n<p><strong>Conclusion<\/strong><\/p>\n\n\n\n<p>Machine learning is a recent addition to software engineering, and with it a whole new set of difficulties and possibilities arises. 
However, just as the software industry at the end of the last century was searching for more principled ways to tackle projects and make the whole process less uncertain, it is now searching for better ways to incorporate ML into the development process.<\/p>\n\n\n\n<p>Part of this evolution consists of making practices more robust and reproducible, and tools such as CometML are contributing towards that goal. As practitioners, we very much welcome such initiatives.<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Intro by Niko Laskaris We released a code-based custom visualization builder called Custom Panels. As part of the rollout, we\u2019re featuring user stories from some of the awesome researchers using Comet as part of their research and development toolkit. One of these teams,&nbsp;Pento.ai, is a long-time Comet user and was part of the beta test group [&hellip;]<\/p>\n","protected":false},"author":140,"featured_media":18122,"parent":488,"menu_order":0,"comment_status":"closed","ping_status":"closed","template":"case-study","meta":{"customer_name":"Pento.ai","customer_description":"ML consulting: Visual analysis, image manipulation, and video analytics","customer_industry":"Technology - AI & ML Development","customer_technologies":"Terran (human perception library)","customer_logo":"https:\/\/www.comet.com\/site\/wp-content\/uploads\/2025\/10\/pento-ai-logo-white.svg","footnotes":""},"coauthors":[127],"class_list":["post-13161","page","type-page","status-publish","has-post-thumbnail","hentry"],"yoast_head":"<!-- This site is optimized with the Yoast SEO Premium plugin v25.9 (Yoast SEO v25.9) - https:\/\/yoast.com\/wordpress\/plugins\/seo\/ -->\n<title>Using Comet Panels for Computer Vision at Pento.ai - Comet<\/title>\n<meta name=\"description\" content=\"Learn how ML consulting firm Pento.ai uses Comet panels for computer vision. 
Code-based custom visualization builder, Custom Panels, is a crucial tool.\" \/>\n<meta name=\"robots\" content=\"index, follow, max-snippet:-1, max-image-preview:large, max-video-preview:-1\" \/>\n<link rel=\"canonical\" href=\"https:\/\/www.comet.com\/site\/customers\/pentoai\/\" \/>\n<meta property=\"og:locale\" content=\"en_US\" \/>\n<meta property=\"og:type\" content=\"article\" \/>\n<meta property=\"og:title\" content=\"Using Comet Panels for Computer Vision at Pento.ai\" \/>\n<meta property=\"og:description\" content=\"Learn how ML consulting firm Pento.ai uses Comet panels for computer vision. Code-based custom visualization builder, Custom Panels, is a crucial tool.\" \/>\n<meta property=\"og:url\" content=\"https:\/\/www.comet.com\/site\/customers\/pentoai\/\" \/>\n<meta property=\"og:site_name\" content=\"Comet\" \/>\n<meta property=\"article:publisher\" content=\"https:\/\/www.facebook.com\/cometdotml\" \/>\n<meta property=\"article:modified_time\" content=\"2025-11-03T20:24:24+00:00\" \/>\n<meta property=\"og:image\" content=\"https:\/\/www.comet.com\/site\/wp-content\/uploads\/2025\/10\/Case-study-Pento.webp\" \/>\n\t<meta property=\"og:image:width\" content=\"1842\" \/>\n\t<meta property=\"og:image:height\" content=\"650\" \/>\n\t<meta property=\"og:image:type\" content=\"image\/webp\" \/>\n<meta name=\"twitter:card\" content=\"summary_large_image\" \/>\n<meta name=\"twitter:site\" content=\"@Cometml\" \/>\n<meta name=\"twitter:label1\" content=\"Est. reading time\" \/>\n\t<meta name=\"twitter:data1\" content=\"8 minutes\" \/>\n\t<meta name=\"twitter:label2\" content=\"Written by\" \/>\n\t<meta name=\"twitter:data2\" content=\"Caroline Borders\" \/>\n<!-- \/ Yoast SEO Premium plugin. -->"}