u/casually-anya
AI tools are not creative
u/wardrox
Are you explicitly asking for these to be shown? Sometimes the AI tools need the task broken down into more detail, and told to plan it out before embarking.
u/Tokail
I’m building this exact solution at the moment, on top of a UX AI tool I’ve been running for 2 years, with more than 80k users.
The challenge I’m facing is, let’s take coding for example, it
u/Skar_ara
AI fundamentally solves one problem: identifying known components and producing output in a structured format, based on the patterns in the data it was trained on.
A screen is just one co
u/siqniz
Good reason to not document anything....
u/tdellaringa
Because they are parlor tricks.
u/ak_sha
AI tools can help us with mundane tasks; for creative and empathy purposes we can't rely on these tools!
u/olivicmic
Why would AI do any of this? AI doesn’t understand context. You need to understand context to do design. Stop trying to find shortcuts.
u/Ramosisend
I had the same issue: most AI tools give you one-off screens, and then you have to manually stitch everything together again in Figma or dev tools.
I've been using UX Pilot recently and what I
u/Levenloos
Could it be that these tools are likely trained on single screenshots with no annotations?
u/Ruskerdoo
Transformer models are generally only trained on complete work. Articles, illustrations, photographs, novels, songs. They only know how to replicate the finished product. They can do lone wir
u/IniNew
They're probably a non-ux person trying to outsource UX work to AI so it's cheaper.
u/cm0011
There is HCI research working on it, that’s for sure
u/NestorSpankhno
If you don’t want to do your job, quit. There are plenty of designers out there who actually want to design who will take your place.
u/sampleminded
So there are a few AI UX tools that do this well. Magic patterns has a canvas so I can have a bunch of screens next to each other. Subframe is going to build a canvas as well. Anima has a flo
u/KoalaFiftyFour
I haven't found anything that completely nails it yet, but some tools are starting to look at the workflow side more. I've seen some stuff from Magic Patterns that seems to be trying to link
u/Momoware
Same reason that they are not good at structured data queries (SQL) without human supervision. LLMs have had a precision/accuracy problem since Day 1.
u/RhinoOnATrain
All I see is a win for trained UX designers
u/mdutton27
No offence but you sound like a user experience person looking to outsource your responsibilities to AI.
Also this IS possible but you need to learn the tools to do it successfully and if
u/IniNew
They're probably a non-ux person trying to outsource UX work to AI so it's cheaper.
u/mdutton27
No offence but you sound like a user experience person looking to outsource your responsibilities to AI.
Also this IS possible but you need to learn the tools to do it successfully and if
u/casually-anya
Ai tools are not creative
u/Tokail
I’m building this exact solution at the moment, on top of a UX AI tool I’ve been running for 2 years, with more than 80k users.
The challenge I’m facing is, let’s take coding for example, it
u/sampleminded
So there are a few AI UX tools that do this well. Magic patterns has a canvas so I can have a bunch of screens next to each other. Subframe is going to build a canvas as well. Anima has a flo
u/Levenloos
Could it be that these tools are likely trained on single screenshots with no annotations?
u/IniNew
They're probably a non-ux person trying to outsource UX work to AI so it's cheaper.
u/Momoware
Same reason that they are not good at structured data queries (SQL) without human supervision. LLMs have had a precision/accuracy problem since Day 1.
u/Momoware
Same reason that they are not good at structured data queries (SQL) without human supervision. LLMs have had a precision/accuracy problem since Day 1.
u/cm0011
There is HCI research working on it, that’s for sure
u/Momoware
Same reason that they are not good at structured data queries (SQL) without human supervision. LLMs have had a precision/accuracy problem since Day 1.
u/mdutton27
No offence but you sound like a user experience person looking to outsource your responsibilities to AI.
Also this IS possible but you need to learn the tools to do it successfully and if
u/sampleminded
So there are a few AI UX tools that do this well. Magic patterns has a canvas so I can have a bunch of screens next to each other. Subframe is going to build a canvas as well. Anima has a flo
u/Skar_ara
AI fundamentally solves one problem of identifying known components and providing output in a structured format based on its data that is trained on certain patterns.
A screen is just one co
u/Tokail
I’m building this exact solution at the moment, on top of a UX AI tool I’ve been running for 2 years, with more than 80k users.
The challenge I’m facing is, let’s take coding for example, it
u/Levenloos
Could it be that these tools are likely trained on single screenshots with no annotations?
u/mdutton27
No offence but you sound like a user experience person looking to outsource your responsibilities to AI.
Also this IS possible but you need to learn the tools to do it successfully and if
u/olivicmic
Why would AI do any of this? AI doesn’t understand context. You need to understand context to do design. Stop trying to find shortcuts.
u/IniNew
They're probably a non-ux person trying to outsource UX work to AI so it's cheaper.
u/Momoware
Same reason that they are not good at structured data queries (SQL) without human supervision. LLMs have had a precision/accuracy problem since Day 1.
u/Momoware
Same reason that they are not good at structured data queries (SQL) without human supervision. LLMs have had a precision/accuracy problem since Day 1.
u/cm0011
There is HCI research working on it, that’s for sure
u/olivicmic
Why would AI do any of this? AI doesn’t understand context. You need to understand context to do design. Stop trying to find shortcuts.
u/Ramosisend
I had the same issue, most AI tools give you one-off screens and then you have to manually stitch everything together again in Dogma or dev tools
I've been using UX Pilot recently and what I
u/RhinoOnATrain
All I see is a win for trained UX designers
u/wardrox
Are you explicitly asking for these to be shown? Sometimes the AI tools need the task broken down into more detail, and told to plan it out before embarking.
u/Momoware
Same reason that they are not good at structured data queries (SQL) without human supervision. LLMs have had a precision/accuracy problem since Day 1.
u/RhinoOnATrain
All I see is a win for trained UX designers
u/Levenloos
Could it be that these tools are likely trained on single screenshots with no annotations?
u/Skar_ara
AI fundamentally solves one problem of identifying known components and providing output in a structured format based on its data that is trained on certain patterns.
A screen is just one co
u/Levenloos
Could it be that these tools are likely trained on single screenshots with no annotations?
u/IniNew
They're probably a non-ux person trying to outsource UX work to AI so it's cheaper.
u/Levenloos
Could it be that these tools are likely trained on single screenshots with no annotations?
u/Momoware
Same reason that they are not good at structured data queries (SQL) without human supervision. LLMs have had a precision/accuracy problem since Day 1.
u/Tokail
I’m building this exact solution at the moment, on top of a UX AI tool I’ve been running for 2 years, with more than 80k users.
The challenge I’m facing is, let’s take coding for example, it
u/Ruskerdoo
Transformer models are generally only trained on complete work. Articles, illustrations, photographs, novels, songs. They only know how to replicate the finished product. They can do lone wir
u/olivicmic
Why would AI do any of this? AI doesn’t understand context. You need to understand context to do design. Stop trying to find shortcuts.
u/Levenloos
Could it be that these tools are likely trained on single screenshots with no annotations?
u/Tokail
I’m building this exact solution at the moment, on top of a UX AI tool I’ve been running for 2 years, with more than 80k users.
The challenge I’m facing is, let’s take coding for example, it
u/Ruskerdoo
Transformer models are generally only trained on complete work. Articles, illustrations, photographs, novels, songs. They only know how to replicate the finished product. They can do lone wir
u/cm0011
There is HCI research working on it, that’s for sure
u/sampleminded
So there are a few AI UX tools that do this well. Magic patterns has a canvas so I can have a bunch of screens next to each other. Subframe is going to build a canvas as well. Anima has a flo
u/casually-anya
Ai tools are not creative
u/casually-anya
Ai tools are not creative
u/cm0011
There is HCI research working on it, that’s for sure
u/Levenloos
Could it be that these tools are likely trained on single screenshots with no annotations?
u/Tokail
I’m building this exact solution at the moment, on top of a UX AI tool I’ve been running for 2 years, with more than 80k users.
The challenge I’m facing is, let’s take coding for example, it
u/cm0011
There is HCI research working on it, that’s for sure
u/wardrox
Are you explicitly asking for these to be shown? Sometimes the AI tools need the task broken down into more detail, and told to plan it out before embarking.
u/siqniz
Good reason to not document anything....
u/NestorSpankhno
If you don’t want to do your job, quit. There are plenty of designers out there who actually want to design who will take your place.
u/tdellaringa
Because they are parlor tricks.
u/siqniz
Good reason to not document anything....
u/tdellaringa
Because they are parlor tricks.
u/siqniz
Good reason to not document anything....
u/wardrox
Are you explicitly asking for these to be shown? Sometimes the AI tools need the task broken down into more detail, and told to plan it out before embarking.
u/sampleminded
So there are a few AI UX tools that do this well. Magic patterns has a canvas so I can have a bunch of screens next to each other. Subframe is going to build a canvas as well. Anima has a flo
u/casually-anya
Ai tools are not creative
u/wardrox
Are you explicitly asking for these to be shown? Sometimes the AI tools need the task broken down into more detail, and told to plan it out before embarking.
u/cm0011
There is HCI research working on it, that’s for sure
u/Momoware
Same reason that they are not good at structured data queries (SQL) without human supervision. LLMs have had a precision/accuracy problem since Day 1.
u/casually-anya
Ai tools are not creative
u/siqniz
Good reason to not document anything....
u/ak_sha
AI tools Can help us in mundane tasks, for creative and empathy purposes we can’t rely upon these tools !
u/Skar_ara
AI fundamentally solves one problem of identifying known components and providing output in a structured format based on its data that is trained on certain patterns.
A screen is just one co
u/olivicmic
Why would AI do any of this? AI doesn’t understand context. You need to understand context to do design. Stop trying to find shortcuts.
u/mdutton27
No offence but you sound like a user experience person looking to outsource your responsibilities to AI.
Also this IS possible but you need to learn the tools to do it successfully and if
u/cm0011
There is HCI research working on it, that’s for sure
u/Levenloos
Could it be that these tools are likely trained on single screenshots with no annotations?
u/mdutton27
No offence but you sound like a user experience person looking to outsource your responsibilities to AI.
Also this IS possible but you need to learn the tools to do it successfully and if
u/cm0011
There is HCI research working on it, that’s for sure
u/tdellaringa
Because they are parlor tricks.
u/olivicmic
Why would AI do any of this? AI doesn’t understand context. You need to understand context to do design. Stop trying to find shortcuts.
u/casually-anya
Ai tools are not creative
u/Ramosisend
I had the same issue, most AI tools give you one-off screens and then you have to manually stitch everything together again in Dogma or dev tools
I've been using UX Pilot recently and what I
u/casually-anya
Ai tools are not creative
u/cm0011
There is HCI research working on it, that’s for sure
u/olivicmic
Why would AI do any of this? AI doesn’t understand context. You need to understand context to do design. Stop trying to find shortcuts.
u/olivicmic
Why would AI do any of this? AI doesn’t understand context. You need to understand context to do design. Stop trying to find shortcuts.
u/RhinoOnATrain
All I see is a win for trained UX designers
u/wardrox
Are you explicitly asking for these to be shown? Sometimes the AI tools need the task broken down into more detail, and told to plan it out before embarking.
u/Ramosisend
I had the same issue, most AI tools give you one-off screens and then you have to manually stitch everything together again in Dogma or dev tools
I've been using UX Pilot recently and what I
u/Levenloos
Could it be that these tools are likely trained on single screenshots with no annotations?
u/Levenloos
Could it be that these tools are likely trained on single screenshots with no annotations?
u/Tokail
I’m building this exact solution at the moment, on top of a UX AI tool I’ve been running for 2 years, with more than 80k users.
The challenge I’m facing is, let’s take coding for example, it
u/Ruskerdoo
Transformer models are generally only trained on complete work. Articles, illustrations, photographs, novels, songs. They only know how to replicate the finished product. They can do lone wir
u/tdellaringa
Because they are parlor tricks.
u/cm0011
There is HCI research working on it, that’s for sure
u/cm0011
There is HCI research working on it, that’s for sure
u/Ramosisend
I had the same issue, most AI tools give you one-off screens and then you have to manually stitch everything together again in Dogma or dev tools
I've been using UX Pilot recently and what I
u/siqniz
Good reason to not document anything....
u/mdutton27
No offence but you sound like a user experience person looking to outsource your responsibilities to AI.
Also this IS possible but you need to learn the tools to do it successfully and if
u/siqniz
Good reason to not document anything....
u/Levenloos
Could it be that these tools are likely trained on single screenshots with no annotations?
u/siqniz
Good reason to not document anything....
u/RhinoOnATrain
All I see is a win for trained UX designers
u/IniNew
They're probably a non-ux person trying to outsource UX work to AI so it's cheaper.
u/KoalaFiftyFour
I haven't found anything that completely nails it yet, but some tools are starting to look at the workflow side more. I've seen some stuff from Magic Patterns that seems to be trying to link
u/Momoware
Same reason that they are not good at structured data queries (SQL) without human supervision. LLMs have had a precision/accuracy problem since Day 1.
u/Ramosisend
I had the same issue, most AI tools give you one-off screens and then you have to manually stitch everything together again in Dogma or dev tools
I've been using UX Pilot recently and what I
u/ak_sha
AI tools Can help us in mundane tasks, for creative and empathy purposes we can’t rely upon these tools !
u/wardrox
Are you explicitly asking for these to be shown? Sometimes the AI tools need the task broken down into more detail, and told to plan it out before embarking.
u/mdutton27
No offence but you sound like a user experience person looking to outsource your responsibilities to AI.
Also this IS possible but you need to learn the tools to do it successfully and if
u/KoalaFiftyFour
I haven't found anything that completely nails it yet, but some tools are starting to look at the workflow side more. I've seen some stuff from Magic Patterns that seems to be trying to link
u/Skar_ara
AI fundamentally solves one problem of identifying known components and providing output in a structured format based on its data that is trained on certain patterns.
A screen is just one co
u/Ramosisend
I had the same issue, most AI tools give you one-off screens and then you have to manually stitch everything together again in Dogma or dev tools
I've been using UX Pilot recently and what I
u/NestorSpankhno
If you don’t want to do your job, quit. There are plenty of designers out there who actually want to design who will take your place.
u/Momoware
Same reason that they are not good at structured data queries (SQL) without human supervision. LLMs have had a precision/accuracy problem since Day 1.
u/Ramosisend
I had the same issue, most AI tools give you one-off screens and then you have to manually stitch everything together again in Dogma or dev tools
I've been using UX Pilot recently and what I
u/siqniz
Good reason to not document anything....
u/RhinoOnATrain
All I see is a win for trained UX designers
u/olivicmic
Why would AI do any of this? AI doesn’t understand context. You need to understand context to do design. Stop trying to find shortcuts.
u/sampleminded
So there are a few AI UX tools that do this well. Magic patterns has a canvas so I can have a bunch of screens next to each other. Subframe is going to build a canvas as well. Anima has a flo
u/NestorSpankhno
If you don’t want to do your job, quit. There are plenty of designers out there who actually want to design who will take your place.
u/ak_sha
AI tools Can help us in mundane tasks, for creative and empathy purposes we can’t rely upon these tools !
u/olivicmic
Why would AI do any of this? AI doesn’t understand context. You need to understand context to do design. Stop trying to find shortcuts.
u/Ruskerdoo
Transformer models are generally only trained on complete work. Articles, illustrations, photographs, novels, songs. They only know how to replicate the finished product. They can do lone wir
u/Ruskerdoo
Transformer models are generally only trained on complete work. Articles, illustrations, photographs, novels, songs. They only know how to replicate the finished product. They can do lone wir
u/olivicmic
Why would AI do any of this? AI doesn’t understand context. You need to understand context to do design. Stop trying to find shortcuts.
u/RhinoOnATrain
All I see is a win for trained UX designers
u/Ramosisend
I had the same issue, most AI tools give you one-off screens and then you have to manually stitch everything together again in Dogma or dev tools
I've been using UX Pilot recently and what I
u/Levenloos
Could it be that these tools are likely trained on single screenshots with no annotations?
u/Ruskerdoo
Transformer models are generally only trained on complete work. Articles, illustrations, photographs, novels, songs. They only know how to replicate the finished product. They can do lone wir
u/tdellaringa
Because they are parlor tricks.
u/Ruskerdoo
Transformer models are generally only trained on complete work. Articles, illustrations, photographs, novels, songs. They only know how to replicate the finished product. They can do lone wir
u/mdutton27
No offence but you sound like a user experience person looking to outsource your responsibilities to AI.
Also this IS possible but you need to learn the tools to do it successfully and if
u/IniNew
They're probably a non-ux person trying to outsource UX work to AI so it's cheaper.
u/tdellaringa
Because they are parlor tricks.
u/Skar_ara
AI fundamentally solves one problem of identifying known components and providing output in a structured format based on its data that is trained on certain patterns.
A screen is just one co
u/casually-anya
Ai tools are not creative
u/Momoware
Same reason that they are not good at structured data queries (SQL) without human supervision. LLMs have had a precision/accuracy problem since Day 1.
u/mdutton27
No offence but you sound like a user experience person looking to outsource your responsibilities to AI.
Also this IS possible but you need to learn the tools to do it successfully and if
u/wardrox
Are you explicitly asking for these to be shown? Sometimes the AI tools need the task broken down into more detail, and told to plan it out before embarking.
u/sampleminded
So there are a few AI UX tools that do this well. Magic patterns has a canvas so I can have a bunch of screens next to each other. Subframe is going to build a canvas as well. Anima has a flo
u/Tokail
I’m building this exact solution at the moment, on top of a UX AI tool I’ve been running for 2 years, with more than 80k users.
The challenge I’m facing is, let’s take coding for example, it
u/casually-anya
Ai tools are not creative
u/tdellaringa
Because they are parlor tricks.
u/Ramosisend
I had the same issue, most AI tools give you one-off screens and then you have to manually stitch everything together again in Dogma or dev tools
I've been using UX Pilot recently and what I
u/Skar_ara
AI fundamentally solves one problem of identifying known components and providing output in a structured format based on its data that is trained on certain patterns.
A screen is just one co
u/wardrox
Are you explicitly asking for these to be shown? Sometimes the AI tools need the task broken down into more detail, and told to plan it out before embarking.
u/Skar_ara
AI fundamentally solves one problem of identifying known components and providing output in a structured format based on its data that is trained on certain patterns.
A screen is just one co
u/Momoware
Same reason that they are not good at structured data queries (SQL) without human supervision. LLMs have had a precision/accuracy problem since Day 1.
u/Momoware
Same reason that they are not good at structured data queries (SQL) without human supervision. LLMs have had a precision/accuracy problem since Day 1.
u/siqniz
Good reason to not document anything....
u/ak_sha
AI tools Can help us in mundane tasks, for creative and empathy purposes we can’t rely upon these tools !
u/Ramosisend
I had the same issue, most AI tools give you one-off screens and then you have to manually stitch everything together again in Dogma or dev tools
I've been using UX Pilot recently and what I
u/olivicmic
Why would AI do any of this? AI doesn’t understand context. You need to understand context to do design. Stop trying to find shortcuts.
u/olivicmic
Why would AI do any of this? AI doesn’t understand context. You need to understand context to do design. Stop trying to find shortcuts.
u/Tokail
I’m building this exact solution at the moment, on top of a UX AI tool I’ve been running for 2 years, with more than 80k users.
The challenge I’m facing is, let’s take coding for example, it
u/KoalaFiftyFour
I haven't found anything that completely nails it yet, but some tools are starting to look at the workflow side more. I've seen some stuff from Magic Patterns that seems to be trying to link
u/cm0011
There is HCI research working on it, that’s for sure
u/mdutton27
No offence but you sound like a user experience person looking to outsource your responsibilities to AI.
Also this IS possible but you need to learn the tools to do it successfully and if
u/olivicmic
Why would AI do any of this? AI doesn’t understand context. You need to understand context to do design. Stop trying to find shortcuts.
u/Tokail
I’m building this exact solution at the moment, on top of a UX AI tool I’ve been running for 2 years, with more than 80k users.
The challenge I’m facing is, let’s take coding for example, it
u/Tokail
I’m building this exact solution at the moment, on top of a UX AI tool I’ve been running for 2 years, with more than 80k users.
The challenge I’m facing is, let’s take coding for example, it
u/Ruskerdoo
Transformer models are generally only trained on complete work. Articles, illustrations, photographs, novels, songs. They only know how to replicate the finished product. They can do lone wir
u/mdutton27
No offence but you sound like a user experience person looking to outsource your responsibilities to AI.
Also this IS possible but you need to learn the tools to do it successfully and if
u/sampleminded
So there are a few AI UX tools that do this well. Magic patterns has a canvas so I can have a bunch of screens next to each other. Subframe is going to build a canvas as well. Anima has a flo
u/cm0011
There is HCI research working on it, that’s for sure
u/tdellaringa
Because they are parlor tricks.
u/siqniz
Good reason to not document anything....
u/casually-anya
Ai tools are not creative
u/Skar_ara
AI fundamentally solves one problem of identifying known components and providing output in a structured format based on its data that is trained on certain patterns.
A screen is just one co
u/NestorSpankhno
If you don’t want to do your job, quit. There are plenty of designers out there who actually want to design who will take your place.
u/wardrox
Are you explicitly asking for these to be shown? Sometimes the AI tools need the task broken down into more detail, and told to plan it out before embarking.
u/SI-1977
I tried giving Miro a straightforward prompt, and I received the essential screens for a SaaS, which look very solid as a starting point to build on.
I'm not able to see if I can generate
u/Levenloos
Could it be that these tools are likely trained on single screenshots with no annotations?
u/RhinoOnATrain
All I see is a win for trained UX designers
u/IniNew
They're probably a non-ux person trying to outsource UX work to AI so it's cheaper.
u/Ramosisend
I had the same issue: most AI tools give you one-off screens, and then you have to manually stitch everything together again in Figma or dev tools.
I've been using UX Pilot recently and what I
u/ak_sha
AI tools can help us with mundane tasks, but for creative and empathy purposes we can't rely on them!
u/Momoware
Same reason that they are not good at structured data queries (SQL) without human supervision. LLMs have had a precision/accuracy problem since Day 1.