PodSearch

Connected

101: Best Represent Pineapple

 

00:00:00   [Music]

00:00:07   From Relay FM, this is Connected, episode 101. Today's show is brought to you by Text

00:00:13   Expander from Smile. My name is Myke Hurley. I'm joined by Mr. Steven Hackett. Hello,

00:00:17   Steven Hackett.

00:00:18   Hello, Michael Hurley.

00:00:20   And Federico Viticci. Ciao, Federico.

00:00:22   Buonasera, Myke. Come stai?

00:00:24   Oh, so fancy.

00:00:25   I just asked you, how are you?

00:00:27   I know, well I don't know, all I know is that it sounds nice.

00:00:30   Just say bene.

00:00:31   Bene.

00:00:32   Okay, thank you.

00:00:34   Steven, you can have the follow-up back this week.

00:00:38   That's good, because the first piece of follow-up is about how you are wrong.

00:00:42   Mmm.

00:00:43   You made a comment that you can make espresso in your AeroPress,

00:00:49   which, if you're not a coffee drinker, just hit the 30-second skip for like six times.

00:00:55   Jordan wrote in to say that to meet the minimum pressure requirement for an Italian espresso,

00:01:00   which is 116 PSI, which seems bonkers to me, you would need to put 460 pounds on the AeroPress

00:01:08   plunger.

00:01:09   So I would say in theory you could do it, but you would break the AeroPress trying

00:01:17   to make it happen, so I will concede that Myke was not right about espresso.
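For anyone who wants to check Jordan's arithmetic: the force on the plunger is just pressure times plunger area. A quick sketch in Python, assuming a plunger diameter of about 2.25 inches (an assumption for illustration, not a measured AeroPress spec):

```python
import math

# force = pressure x area; 116 PSI over a few square inches
# really does land in the hundreds of pounds.
pressure_psi = 116                       # quoted minimum for Italian espresso
plunger_diameter_in = 2.25               # assumed plunger diameter
area_sq_in = math.pi * (plunger_diameter_in / 2) ** 2
force_lb = pressure_psi * area_sq_in
print(round(force_lb))                   # prints 461, close to the quoted 460
```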

00:01:23   See the thing is you don't want to say you were wrong, you're just gonna say you were

00:01:27   not right. That's a subtle difference.

00:01:31   Yeah I prefer was not right than wrong.

00:01:35   I bet you do.

00:01:36   Yeah I prefer was not right. Because as well that keeps it in the branding, you know? Myke

00:01:42   and right, just sometimes what's in the middle differs.

00:01:46   Speaking of branding, just a quick aside from the follow-up. So anyway, you were not right

00:01:51   about the coffee but speaking of branding I'm writing my review and there

00:01:56   was a section where I needed to have an example of an app and Myke was kind

00:02:00   enough to let me use the 123 branding. Yeah, we are co-branding

00:02:06   a brand new creation. Mm-hmm, co-branding a co-creation, because we're co-creators

00:02:11   of an example, I think. Wow. Thank you, Myke. So the 123 family of

00:02:18   applications is growing. It will expand onto new horizons. New horizons, and the

00:02:26   introduction of the new app in the 123 family will be appearing sometime in

00:02:32   September. Mm-hmm, that's right. Probably before Notetaker is released, I'm afraid.

00:02:37   Working hard, but. Before we move on, just to put the amount of weight on the

00:02:42   AeroPress plunger into context that, you know, normal people could relate to,

00:02:47   it's 12 iMac G3s.

00:02:49   Just uh, I don't think the AeroPress would hold up to that.

00:02:52   More people can relate to just the flat out 460 pounds of pressure than 12 iMac G3s.

00:02:59   Also that's not even your entire collection.

00:03:01   No, I have more than that.

00:03:04   I wouldn't use Blue Dalmatian for that.

00:03:06   You don't want that anywhere near your coffee.

00:03:09   Up next we talked about rich link previews in the iOS 10 Messages application.

00:03:15   If I send Myke a text with a tweet in it, like the tweet would kind of be loaded

00:03:19   in a little preview, or if you send a MacStories link, the MacStories logo would

00:03:24   be there. We talked about, you know, what sort of data load that would require, why

00:03:29   is Messages still, like, reloading if you peek and pop and then actually open it in

00:03:34   Safari. We had some anonymous feedback that I thought was interesting, that the

00:03:39   preview, that little preview window, is using Open Graph tags, which are basically

00:03:44   meta tags on the page that specify the URL, image, title, description, etc. You see

00:03:50   these used in other places, so if you put in, again we'll use MacStories as an example,

00:03:55   if you paste in a MacStories link to Facebook or Slack, the MacStories artwork

00:03:59   loads for you automatically. That's an Open Graph tag pointing to a

00:04:03   PNG on Federico's server somewhere saying, hey, this is what the logo is, this is the

00:04:08   title, etc. And it seems that iMessage is using this to build these little

00:04:13   previews. And where the feedback gets a little stranger, and I'm not quite sure how

00:04:18   this works, Federico, I'm curious if you do, is that this still requires iMessage to

00:04:23   download the page, then parse for the tags, and then show you the preview. So if

00:04:31   you text somebody a link and they see the preview, and they peek and pop

00:04:35   to get a preview, and they push through to open it in Safari, you've loaded

00:04:39   the page potentially three times. Doesn't seem super efficient to me.
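To make that anonymous feedback concrete: Open Graph data is just `<meta>` elements in a page's `<head>`. A minimal sketch of what a client like Messages could do with them, using Python's standard-library HTML parser on an invented sample page (the tag values and URLs below are made up for illustration, not taken from macstories.net):

```python
from html.parser import HTMLParser

# Hypothetical page markup carrying Open Graph <meta> tags.
SAMPLE_PAGE = """
<html><head>
  <meta property="og:title" content="Example Article" />
  <meta property="og:image" content="https://example.com/logo.png" />
  <meta property="og:description" content="A short summary." />
  <meta property="og:url" content="https://example.com/article" />
</head><body>...</body></html>
"""

class OpenGraphParser(HTMLParser):
    """Collects og:* <meta> tags from a page's markup."""
    def __init__(self):
        super().__init__()
        self.tags = {}

    def handle_starttag(self, tag, attrs):
        if tag != "meta":
            return
        attr_map = dict(attrs)
        prop = attr_map.get("property", "")
        if prop.startswith("og:"):
            self.tags[prop] = attr_map.get("content", "")

parser = OpenGraphParser()
parser.feed(SAMPLE_PAGE)
print(parser.tags["og:title"])  # prints: Example Article
```

The parsed title, image URL, and description are exactly the pieces a preview bubble needs, which is presumably why Facebook, Slack, and now iMessage all lean on the same tags.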

00:04:45   Federico have you been able to kind of like see what's going on here in your time with

00:04:49   the betas?

00:04:50   So I've been actually researching this since last year, because Apple launched, with iOS

00:04:55   9, universal links and Spotlight search.

00:04:58   They had a session and documentation for developers on how to add web markup to links and to search

00:05:05   results, and basically Apple is adopting a few technologies here. It's not just

00:05:10   Open Graph. Open Graph is used for, like, featured images, titles, descriptions, but

00:05:16   also they're supporting schema.org, which is used for, like, structured semantic

00:05:20   results, such as, like, restaurant reservations and hotel rooms, that kind

00:05:23   of stuff. They're also supporting Facebook and Twitter cards, which are

00:05:27   kind of based on a subset of Open Graph tags. My understanding is

00:05:33   Apple was using Open Graph and schema.org for Spotlight and, last year, for rich

00:05:41   link previews in Notes, and this year they're also adopting Open Graph in

00:05:45   iMessage. There's an inconsistency between the implementation in iMessage and

00:05:49   Notes in the current beta of iOS 10 so basically in iMessage you get richer

00:05:55   results. For example, when you paste a link to a tweet in iMessage in beta

00:05:59   3, not only do you get the text of the tweet, but you also get, like, a large image preview

00:06:06   of any image attachments to the tweet.

00:06:08   You do not get the image attachment in Notes.

00:06:11   My theory is that you're not required as an app to go fetch the entire HTML of the web

00:06:19   page, but you can just query the link for tags.

00:06:22   So it's not like you're downloading a 5 megabyte HTML page, you're just querying for those

00:06:28   tags and assembling the information from those tags, so you can just fetch the

00:06:31   title, fetch the featured image, and fetch, say, the description, for example, which is

00:06:36   still data, you know, you're still consuming data, so that's why maybe

00:06:41   you're required to accept the preview. I currently don't

00:06:46   understand it; this is why I have a note in my draft. I don't understand

00:06:49   if you need to confirm that you want to load previews the first time and then

00:06:54   after the first time it's on by default. I don't understand if there's any way to

00:06:58   disable them after the first time and I don't understand if actually it's a bug

00:07:02   and you need to confirm every time. So it's still... the behavior is changing, it

00:07:07   was different in beta 1, different in beta 2, different in beta 3, so it's still

00:07:10   changing. It seems like there's no setting to say I always want to load

00:07:15   those previews or let me decide, let the recipient decide. I feel like there's

00:07:20   going to be some clarification before the GM, at least that's what I hope. If

00:07:25   Apple wants to add an extra confirmation step I feel like a better solution would

00:07:29   be to let the recipient load the preview the first time and then after that

00:07:35   previews are always gonna be on by default, that seems like a reasonable way

00:07:40   to implement this because you know that you're gonna load, I don't know, like 500

00:07:45   kilobytes of featured image previews in your iMessage conversation on cellular

00:07:50   data, that could be, but I don't think you're required to download an entire webpage every

00:07:57   time, that would be kind of silly, so if anyone has any more details or, you know, I couldn't

00:08:03   find any documentation from Apple on this, so if you have links, send us an email, Steven

00:08:08   will include it in the follow-up.
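On Federico's theory that a client needn't pull down the whole page: one plausible approach, and this is purely an assumption about what iMessage might do rather than anything Apple has documented, is to request only the first few kilobytes of a page, which usually covers the `<head>` where these tags live. A sketch with Python's standard library (the usage URL is hypothetical, and servers are free to ignore the `Range` header and send the full body):

```python
import urllib.request

def fetch_page_head(url: str, max_bytes: int = 4096) -> str:
    """Fetch at most max_bytes of a page, usually enough to cover the <head>."""
    req = urllib.request.Request(url, headers={"Range": f"bytes=0-{max_bytes - 1}"})
    with urllib.request.urlopen(req) as resp:
        # Even if the server ignores the Range header, read() is capped,
        # so we never pull the full body into memory.
        return resp.read(max_bytes).decode("utf-8", errors="replace")

# Hypothetical usage (not run here):
# partial_html = fetch_page_head("https://example.com/article")
```

Either way, the client still pays for the featured image itself once it decides to render the preview, which fits Federico's point about why Apple might ask for confirmation.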

00:08:10   My feeling is like, if you have to press a button to load a preview every time, it kind

00:08:16   of negates the point of having the preview in the first place.

00:08:20   Yeah, but you can see how Apple wants you to confirm that you want to consume a little

00:08:25   more data to load previews.

00:08:28   It's not even a lot of data, but I mean, when you're downloading, like, let's say that I

00:08:31   have a featured image and it's a 1MB JPEG for a story on the website, I can imagine

00:08:38   why Apple wants you to confirm that, you know?

00:08:41   Because maybe people are gonna share a lot of links and they're gonna, you know, end

00:08:46   up with 10 megabytes extra every day in iMessage, and that adds up over a month of usage.

00:08:52   I'm not sure how Messenger for example or WhatsApp handle this.

00:08:55   Maybe they do some server-side optimization to reduce the size of the images that they

00:09:01   load in rich link previews.

00:09:03   Maybe?

00:09:04   Could be?

00:09:05   I don't know.

00:09:06   Yeah, a little bit of a mystery.

00:09:08   I too tried looking in the documentation on the developer website and there's just nothing

00:09:13   there yet.

00:09:14   That's not super uncommon, I think. Sometimes people hear us say that and they're surprised

00:09:18   that not every single thing is documented, and the reality is Apple's

00:09:22   documentation really focuses on where developers interact with the system and

00:09:26   not necessarily what the system is doing internally. It's much more about

00:09:30   APIs and endpoints and that sort of thing, as opposed to "this is what our

00:09:35   first party app is doing behind the scenes." That sort of stuff they really

00:09:38   just don't share very often.

00:09:39   Last bit of follow-up is from our friend and yours, Rob Lewis, that the new tvOS

00:09:46   beta allows you to connect a PlayStation controller. Myke, I'm gonna let you talk about this.

00:09:54   This is so strange. There's a link in the show notes to a Reddit thread that kind of

00:09:58   explains what's going on here. If you put a PlayStation controller into pairing

00:10:03   mode, so Bluetooth pairing mode, as you would to connect it to another like

00:10:08   PlayStation or to another device like a Mac or something because you can do that

00:10:12   with PlayStation controllers. If you go to Settings and go to the pairing mode on

00:10:17   tvOS, it shows up as "Wireless Controller." It doesn't seem like, as of right now, it

00:10:24   is completely implemented. For example there is no button that maps to the menu

00:10:30   button in tvOS to allow you to navigate back through menus and stuff but it's

00:10:35   working and I feel like it can't be an accident because it hasn't worked before.

00:10:43   I don't know, because I don't really understand the way this stuff is implemented enough I

00:10:48   would say but it seems strange that it would just magically start to work now.

00:10:53   I hope that this is something that Apple is looking to do.

00:10:56   My feeling is, if Apple are not willing to make their own first party controller,

00:11:03   they should support everything, in the sense of have their own program that you can go

00:11:07   through, like the MFi program, but also allow support for other Bluetooth controllers

00:11:15   like the Xbox controller and the PlayStation controllers.

00:11:18   If Apple doesn't want to make something great that's first party, let me use my other great

00:11:24   games controllers to play games on the Apple TV, as long as they have enough buttons for

00:11:28   it,

00:11:29   a minimum threshold, or maybe they just bake in support for PlayStation 4 or Xbox controllers

00:11:36   and go with it.

00:11:37   I mean at this point Apple is clearly not taking games seriously on tvOS.

00:11:44   I mean we knew this was gonna happen, Apple is not you know by any standard gaming company

00:11:50   they just believe in the App Store and they're saying you know everything's fine on the App

00:11:54   Store, you can have free games, in-app purchases, and you know, it's fine. They're

00:11:58   not treating the tvOS App Store as something more like the Nintendo eShop or the PSN or Xbox Live,

00:12:03   which is fine, a bit disappointing maybe, but not surprising. So at this point anything that makes

00:12:10   the experience better works for me, I mean, give us anything really. If I can have support for

00:12:15   PlayStation and Xbox One controllers, it's fine, you know, Apple is not doing any first-party

00:12:20   controller, and obviously the PlayStation and the Xbox ones are much better options than what

00:12:26   you get from the Apple Store.

00:12:29   Even the Stratus controller is fine,

00:12:32   but it's not PlayStation, it's not DualShock, right?

00:12:35   So if you wanna have basic support for those controllers,

00:12:38   I mean, all right, give us support for those

00:12:41   because they're better.

00:12:42   - All right, this week's episode is brought to you

00:12:46   by TextExpander from Smile, simply indispensable.

00:12:50   I want you to imagine a world in which you no longer need

00:12:52   to type the same text, the same phone number,

00:12:55   the same address, the same marketing copy, the same driving directions, anything that

00:12:59   you type regularly, imagine not having to sit and write it all out every time, imagine

00:13:04   just hitting a couple of keystrokes and all of a sudden everything is beautifully filled

00:13:10   out for you. This is the magic of a life with TextExpander. You can store anything you want

00:13:16   in a snippet, you can put text in there, you can put images in there, you can have fill

00:13:19   in the blank snippets, so you can have say a couple of paragraphs with a few elements

00:13:24   that you want to choose, you can type in blank into blank fields, you can select from dropdowns,

00:13:29   you can set all of this stuff up in TextExpander and you'll be saving yourself tons of time

00:13:34   every single day.

00:13:36   One of my favourite things in TextExpander is in the preferences, you can find out just

00:13:41   how much time you have saved in your life.

00:13:45   So I'm going to bring that up right now, they have a great statistics view.

00:13:49   I have expanded on my Mac 11,119 snippets, which has saved me 17.2 hours of my life with

00:14:01   not having to type stuff.

00:14:03   I am happy about that statistic.

00:14:05   Think of how many movies I could watch in those 17 hours.

00:14:08   That's what TextExpander has allowed me to do.

00:14:12   And now there is a sharing ability in TextExpander.

00:14:14   You can share groups of snippets with others.

00:14:16   You can expand your common knowledge within your team or group and keep everyone on the

00:14:20   same page.

00:14:21   Keep everything current.

00:14:23   If something's changed in a piece of marketing copy that's being sent out, everyone's going

00:14:26   to get it because it will be synced with everyone.

00:14:29   TextExpander now includes apps for Mac, iPhone, iPad and Windows, which is currently in beta.

00:14:35   You have all of your snippets on all of your devices all the time.

00:14:39   Subscriptions start at $40 per year and include all the apps and the TextExpander sharing

00:14:44   service as well with discounts for registered TextExpander users. Team subscriptions are

00:14:48   now also available and include organization, focus, snippet and team management, detailed

00:14:53   access control, consolidated billing and more. Boost your productivity and learn more at

00:14:59   smilesoftware.com/connected. Thank you so much to TextExpander from Smile for their

00:15:03   support of this show and Relay FM.

00:15:08   So Steven, you have mentioned on a previous episode that you had joined the multi-pad

00:15:13   lifestyle and picked up a 9.7 inch iPad Pro to go along in conjunction with

00:15:18   your 12.9 inch iPad Pro. I predicted that you would keep them both but I think

00:15:26   you're about to say that #Myke was not right about this as well today.

00:15:30   It is another instance of Myke was not right. I told you that I was not going to

00:15:35   stay in the multi-pad lifestyle, that it was a mere season of experimentation.

00:15:43   So yeah, so I stuck with the 9.7 and I think you have a bunch of questions for me that

00:15:50   in the document all seem very charged and angry even.

00:15:54   But in short, before we get to those questions, I only really spent time with the 9.7 because

00:16:02   I was setting one up for a consulting client and then they ended up leaving town.

00:16:08   So I was stuck with their iPad for a little while and it's nice.

00:16:16   And using it and the 12.9 side by side you really can see the differences.

00:16:21   Just last night I was at the Apple store and sat down at the 12.9 inch again just to make

00:16:27   sure I hadn't made a mistake.

00:16:29   And there's a lot to like about the 12.9.

00:16:32   I like both.

00:16:33   I think I just like the 9.7 a little bit more.

00:16:36   So what is your current situation then? Your own personal iPad ownership situation?

00:16:43   It is a 9.7 Pro. Ok, so you have bought one.

00:16:47   I sold the 12.9 and bought my own 9.7 Pro. Ok, so I'm very confused about all of this.

00:16:55   Me too. Because the 12.9 inch iPad changed your opinions

00:16:59   about working with the iPad. You have had 9.7 inch iPads forever.

00:17:03   12.9 showed you that there was work that you could do writing that you could do on the

00:17:08   iPad but now you've gone down a size.

00:17:10   Are you still going to work on the iPad?

00:17:13   I am.

00:17:14   So to revisit the discussion from I guess almost a year ago, it's hard to believe that

00:17:19   the 12.9 has been out, you know, nine months.

00:17:22   I really felt like the big iPad Pro, for the very first time, unlocked the iPad from

00:17:28   a productivity perspective.

00:17:31   And thinking back on early days of this show and of the prompt, Federico doing his job

00:17:36   on an iPad Mini now seems completely bonkers to me because the bigger screen did sort of

00:17:43   open me up to that world for the first time.

00:17:47   And that's all still true.

00:17:48   I still am working on the 9.7, I'm still doing the same work I was doing on the 12.9, which

00:17:53   is very heavily focused on my writing.

00:17:55   A little bit of relay administrative stuff, but mostly writing.

00:18:00   But what came to be my overall feeling about the 12.9

00:18:04   is that the size and the weight made me

00:18:10   less likely to carry it around.

00:18:14   And so I, again, like Federico,

00:18:16   I've been using the iPad up and walking around the house,

00:18:21   so taking it into the kitchen

00:18:22   if I'm cooking lunch for myself,

00:18:23   and listening to something,

00:18:26   or even taking notes on something,

00:18:27   just as I do other things.

00:18:29   And the 12.9, I felt, again, this is all personal,

00:18:34   subjective stuff, really wasn't as portable as I wanted.

00:18:38   But what I still credit to the 12.9

00:18:42   and what I still believe today is that the iPad Pro

00:18:45   is a great machine for getting certain types of work done.

00:18:49   And for me, that's different from you, Myke,

00:18:51   and that's different from what Federico wants.

00:18:53   But for me, it sort of took over this set of tasks that the

00:18:57   9.7 is still plenty good for. In some ways this is, I think, a very worthwhile

00:19:03   experiment that if I had never used the 12.9 I don't think I ever would have had

00:19:08   the realization that I could do work on a tablet comfortably and a lot of

00:19:14   that's the 12.9 inch screen but what I really think is so much of that is just

00:19:18   the split screen and Slide Over stuff. I spoke about this on Mac Power Users when I was on

00:19:22   their show a couple weeks ago, about doing, you know, research for a history

00:19:28   article, and having YouTube up, and then having Notes up, and then swapping YouTube

00:19:32   out for 1Writer, and being able to work and have my notes always visible.

00:19:36   All that's so possible on the 9.7. Now, it's smaller, and the Smart Keyboard is

00:19:43   like a super big necessity, because multitasking with the software keyboard

00:19:48   up is hilariously not good.

00:19:51   I mean, you know, your Slide Over app, or your app on the side, is

00:19:54   very small at that point. But I'm still working on it, and I'm using the Smart

00:19:58   Keyboard, and I'm still writing on it, but the benefit is that I am much

00:20:03   more likely to pick it up and take it with me. And, you know, I carry a

00:20:08   15 inch MacBook Pro, as some listeners may know, and the 12.9 iPad is obviously

00:20:13   nowhere close to the size or the weight of the 15 inch, but the 9.7 is just

00:20:19   so drastically smaller than my MacBook Pro that it is

00:20:26   encouraging to just take the iPad with me and leave the Mac behind. And so I

00:20:31   have answered your question: I'm still doing the work, I still believe

00:20:35   in that work, I still think that the 12.9 taught me that, but I think the

00:20:38   lessons, at least for what I'm doing, are still applicable on the smaller size.

00:20:44   So you're not finding that it had anything really to do with the 12.9 inch iPad, like

00:20:50   from a physical perspective to allow you to do the work that you wanted to do on the iPad.

00:20:55   It was maybe some of the advancements that came with multitasking and then a device that

00:21:01   could convince you to do it that has now allowed you to carry this thinking on into the smaller

00:21:07   iPad as well. I think that, yeah, I think you just said what I tried saying.

00:21:11   It's a much more concise way. It's just really interesting, right, because I

00:21:18   agree mostly with what you're saying, because I am able to do the majority, if

00:21:25   not all, of the work that I do on any iPad, right? Like, I'm able to

00:21:30   take the 9.7 and do the work that I would do on the 12.9, but it's just a

00:21:34   little bit more cramped. But it's perfectly functional, especially with an

00:21:40   external keyboard, for me, because it makes the split screen more useful with

00:21:44   the smaller device, because the keyboard isn't covering the screen up. But

00:21:50   for me, I'm still picking up the 12.9 to do serious things in, because I like to

00:21:57   do that. It's just very... I don't know, it just continues the ongoing struggles you have

00:22:05   with devices. I feel it more than anybody that I know, you buy and return and buy and

00:22:10   return so much stuff.

00:22:12   Quiet.

00:22:13   I feel like I'm in the opposite situation as Steven. So you know, I bought a small iPad

00:22:20   Pro when I was in San Francisco and I used it for a while to get an idea of iOS 10, but

00:22:28   since I put iOS 10 Beta 3 on my big iPad Pro, which is where I get all of my writing work

00:22:35   done, I haven't picked up the small Pro at all, even in those situations where before

00:22:42   I said "Yeah, it's nice to be able to have a smaller iPad", but it's just... as I feared,

00:22:49   I think the context switching for me doesn't really work, so it's nice to have a small iPad,

00:22:55   but I think it'll go to Sylvia real soon.

00:22:59   I've got to say, I am really not enjoying post-100 connected.

00:23:03   I'm wrong about everything.

00:23:06   I'm sorry, Myke.

00:23:08   It's just a slow realization that people are not you, you know?

00:23:12   It would be way nicer if everyone was just like me.

00:23:16   So, you're not finding yourself wanting to use the 9.7 inch to like read Twitter with

00:23:24   and stuff, like is that just not more comfortable for you?

00:23:26   No, because I'm either sitting down or, you know, being on the couch or in bed with the

00:23:34   big iPad Pro, or I'm using the iPhone, the 6s Plus.

00:23:40   And I think the small iPad, it's a really nice device and every time I pick it up I'm

00:23:45   "okay yeah it's a small iPad and it's lighter" but it's just you know I'm so used to proper multitasking on the 12.9 and I'm using it all the time

00:23:56   and then I have the iPhone so I have this other iPad and I'm like "it's nice but do I really need it?"

00:24:03   I mean, I needed it for a month when I needed to test the iOS 10 beta, but what about now? I got the beta on my iPad Pro, and there's

00:24:10   more iPad stuff for the big iPad Pro than the small one in iOS 10. So do I really need

00:24:17   it? I don't know. That's the situation that I'm currently in.

00:24:21   Steven, are you continuing to use a keyboard with the 9.7 inch?

00:24:27   I am. I'm using the smart keyboard, which is not as good as the 12.9, but it is perfectly

00:24:33   usable. I have found, like the other day, like the iSight article I published last week, I

00:24:40   wrote on the iPad, and I used, you know, the aluminum Magic Keyboard for that. So

00:24:46   longer form stuff I will grab the magic keyboard. It just is a little bit more

00:24:51   comfortable but the smart keyboard is perfectly usable. It has all the

00:24:55   benefits of the big smart keyboard. Myke you've talked a lot about this that you

00:24:59   just always have a keyboard with your iPad; you don't have to, like, go get it, or

00:25:03   forget whether it's charged, or, like, you left it in your other bag.

00:25:06   It's just because it's always literally attached to it you always have it.

00:25:09   And that's still really great on the little one. It's what I have on there. I don't have a regular smart cover for it. The keyboard is always attached.

00:25:16   And I do like the software keyboard better on the 12.9 inch. I'm not sure I'm in the majority of that. I think a lot of people don't like it. But I really do like it on the bigger one.

00:25:28   And so it's a little bit of an adjustment there.

00:25:33   But yeah, using the keyboard and doing the writing thing and going from there.

00:25:40   So on Upgrade this week, we spoke about the Razer mechanical keyboard, because Jason got

00:25:45   one, and effectively "no bueno" is the answer.

00:25:51   Yeah, I listened to that yesterday.

00:25:54   And if I have a mic I texted to you, I was using my iPad Pro in the car.

00:25:59   I had to listen to a podcast for a series of coincidences.

00:26:04   But yeah, it seems like that thing's not really all that great.

00:26:09   And yeah, it's fine.

00:26:10   Bluetooth is really good on iOS these days.

00:26:14   And I have a Magic Keyboard that is dedicated to the iPad, and so it's just in my drawer

00:26:20   on my desk.

00:26:21   That way I'm not having to, like, pair between the computer and the iPad and dealing with that. But, um, it's nice to have options.

00:26:26   It just seems that where the Razer fails is that when you put the iPad Pro in the case and

00:26:34   have the keyboard with it, it's heavier than a MacBook Pro.

00:26:37   And I know what they're trying to do, I appreciate what they're trying to do, but I think

00:26:45   that is not good. Like, that's not a good end result; a good end result is not a device which is heavier than a laptop.

00:26:53   I think you're kind of defeating the object. Oh,

00:26:56   Yeah, I would still love to see more happen in the keyboard space for the iPad

00:27:01   I'm happy that you've finally settled on an

00:27:04   iPad for you, even though I think that you're both making the wrong decision.

00:27:10   But it looks like the law of averages is going against me on this one.

00:27:14   Yeah, and I think that, um, I think that overall the 9.7 inch iPad Pro is probably, like, the default

00:27:21   iPad for a lot of people, and that the big one is really useful for people like you guys, who are doing a

00:27:28   substantial amount of your work on it.

00:27:31   But for people like me who are only doing some of their work on it or maybe even just using it for like media stuff

00:27:37   that size like there's a reason Apple started with that size and

00:27:41   I think that

00:27:43   if Apple had released the 9.7 and the 12.9 inch iPad Pro at the same time, I'm not

00:27:50   sure I would have ever tried the 12.9. I don't know, I mean, it's, you can't, it's

00:27:56   hard to say in hindsight, but the 12.9 iPad Pro came out first. I think it, you

00:28:02   know, put a bunch of people in a situation where they were willing to

00:28:05   experiment with it. I know Marco Arment went through this as well. He used the 12.9, didn't like it, but

00:28:09   really likes the 9.7. So I think that weird cycle of the iPad that we

00:28:15   talked about played in favor of the 12.9 for a lot of people, but it'll be

00:28:20   interesting to see, moving forward, how that pans out. I definitely see

00:28:24   12.9 inch iPads. I saw one just the other day at a coffee shop, someone was

00:28:28   using one, but I don't know, overall, if it's ever going to become, like, the most

00:28:35   popular version. I think that 9.7 is a really good size, it's a really good iPad, and I

00:28:42   think that people, it's good to have options, but I think people will kind of stay in that

00:28:47   size range more. Last week we were talking about Apple and

00:28:55   data in iOS 10, and there was another part of this puzzle that we didn't get to, which

00:29:01   is photo data and how Apple is dealing with photos in iOS 10.

00:29:06   So to kind of put this in a little bit of context here,

00:29:13   the idea of looking at faces, understanding

00:29:17   what a face looks like, looking at a horse, looking at a mountain,

00:29:20   knowing what horses and mountains look like,

00:29:22   this stuff has to come from somewhere.

00:29:24   Apple has to train a system that can get better

00:29:28   and can detect this stuff.

00:29:30   Now, with where we are right now,

00:29:34   if we look at Google, for example,

00:29:36   Google has incredible amounts of data about photos

00:29:41   because they own and run Google Image Search.

00:29:44   And I'm sure that over the years

00:29:46   as they've been running Google Images

00:29:48   and refining their searches and building up metadata,

00:29:51   they are sitting on a bank of incredible data

00:29:54   of stuff that looks like something, right?

00:29:57   They know it's their data, right?

00:29:59   And also I'm sure, I mean,

00:30:01   and I don't know the full privacy stuff in this,

00:30:04   but when you upload your photos to Google Photos,

00:30:07   it's helping to further train that system.

00:30:11   I don't believe that it's tagging Myke Hurley

00:30:14   has a photo of Steven Hackett

00:30:16   and they have piles of cash in front of them

00:30:18   and that's being saved on Google.

00:30:20   - Do you?

00:30:21   - If it was, I wouldn't want people

00:30:23   to see that picture, right?

00:30:24   That's being uploaded to a server somewhere.

00:30:28   I don't think they're doing that,

00:30:29   but I think they are understanding like,

00:30:31   two guys wear glasses, cash table.

00:30:35   They're like, understand that,

00:30:36   and that photo helps to further categorize that.

00:30:40   I assume that they're like picking out key elements

00:30:42   so they can try and draw what that thing looks like.

00:30:45   I mean, I'm sure there's lots of scientific

00:30:47   and smart ways to explain this,

00:30:49   but that's the way that my brain thinks

00:30:51   this stuff through.

00:30:52   Now, Apple has no starting position of their own, right?

00:30:57   There is no Apple image search,

00:31:00   so they have no data that already exists

00:31:04   that can help them with this.

00:31:06   And then furthermore,

00:31:08   based on what Apple is saying that they are doing,

00:31:12   I can't imagine that they are doing that analysis on images

00:31:16   and then sending that somewhere where it's saved.

00:31:19   Like I don't imagine that they're looking at mine

00:31:22   and looking at Federico's and looking at Steven's photos,

00:31:24   working out what's there and then uploading that data.

00:31:27   Am I right in thinking that, Federico? I feel like it's important to establish this.

00:31:30   Yeah, I feel like that's obvious. That's a given, especially when you consider what Apple

00:31:35   has said about differential privacy and what they're doing with on-device analysis of pictures.

00:31:41   And when you consider the opposite of what Google is doing, not only do they run Google

00:31:47   image search, but they can also analyze what people prioritize. So when you use Google

00:31:52   Image Search, indirectly, maybe without knowing it, you're making a direct contribution

00:31:59   to training the algorithm. So if you search for pineapple and you click on the pictures

00:32:05   that according to your opinion best represent a pineapple, you're training the Google algorithm,

00:32:10   you know the intelligence, to be able to say okay X number of users have said that this

00:32:16   is a correct picture of a pineapple, therefore we should train our algorithm to decode more

00:32:21   pineapple images like this. And not only that, but even when you use CAPTCHAs on Google to

00:32:26   authenticate yourself and say "I'm not a robot" and they ask you with the modern CAPTCHAs to

00:32:31   start... For example, two days ago I got a Google CAPTCHA saying "Choose three pictures

00:32:37   of construction equipment" and there were like a bunch of different types of vehicles

00:32:44   and sure there were a few construction ones and I picked them and I can only imagine Google

00:32:49   is using this data, so on the surface I'm trying to say "I'm not a robot, please choose me, because

00:32:55   see I can pick pictures of construction equipment", but under the hood, in fact, I'm training the

00:33:00   Google algorithm to better understand what is going on. And I believe that's why Google

00:33:05   photo search is so accurate and so amazing. On the other hand, Apple is not doing any of this,

00:33:10   they don't have Apple image search, they maybe have stuff like Siri when you ask for pictures

00:33:16   of horses and they show you snippets from Bing and maybe they could look at what users

00:33:21   tap in Siri, but I don't think they are.
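
[Editor's note: a toy sketch of the implicit labeling loop Federico describes, where search clicks (and CAPTCHA picks) are counted per query/image pair and promoted to training labels once enough users agree. The function, threshold, and log format are all invented for illustration; this is not how Google actually works.]

```python
# Count user clicks per (query, image) pair and promote pairs with enough
# independent agreement to training labels. All names are hypothetical.
from collections import Counter

CLICK_THRESHOLD = 3  # hypothetical: agreeing clicks needed to trust a label

def harvest_labels(click_log, threshold=CLICK_THRESHOLD):
    """click_log is a list of (query, image_id) pairs from user clicks."""
    counts = Counter(click_log)
    return {image_id: query                     # keep pairs with consensus
            for (query, image_id), n in counts.items()
            if n >= threshold}

clicks = [
    ("pineapple", "img1"), ("pineapple", "img1"), ("pineapple", "img1"),
    ("pineapple", "img2"),                      # one click: not enough
    ("horse", "img3"), ("horse", "img3"), ("horse", "img3"),
]
labels = harvest_labels(clicks)
# img1 and img3 become labeled training examples; img2 is dropped
```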

00:33:25   And so it begs the question, where is this data coming from in iOS 10?

00:33:29   How do you know what's a tree?

00:33:32   What's a boat?

00:33:34   What's a beach?

00:33:35   What's a pizza, for example?

00:33:37   Where does this data come from?

00:33:40   And we can only infer that Apple is doing some kind of partnerships, some kind of collaboration

00:33:47   with other data sets that don't come from users.

00:33:51   So let's establish that there must be a data set that comes from somewhere.

00:33:56   And let's establish that if you leave the users aside, that leaves either employees

00:34:01   of Apple who collaborate by sort of crowdsourcing their own pictures, or it comes from third-party

00:34:08   companies.

00:34:09   I'm not trying to be a genius here, those are the only options left.

00:34:12   So if we can infer...

00:34:14   Or they have actually created a computer that can look at images and, you know, it's a brain.

00:34:19   Then again, can you create a brain, Myke?

00:34:22   No.

00:34:22   Really?

00:34:23   I mean, but like, that's the only other option is they have a machine that can do it.

00:34:26   But that doesn't... I mean, we will just naturally assume that they have not done that.

00:34:30   Because I think even Apple, as secret as they are, would probably want to tell the world

00:34:34   that they've created a computer that can think all on its own.

00:34:37   We have created a robot that can go out in the real world and order pizzas and ride horses.

00:34:43   That leaves the two simplest explanations, which would be Apple has a set of internal images

00:34:49   or Apple has bought images from someone else.

00:34:52   And I believe it's two options that make sense, right?

00:34:56   It's a company that has the money to go out to any kind of image provider and say,

00:35:02   look, we want to look at like 3 billion images,

00:35:06   and we want to be able to look at the tags

00:35:09   that you use to describe those images.

00:35:10   And we want to be able to buy this database

00:35:13   and use it and train our algorithm for now,

00:35:16   because we have no knowledge of what a horse is.

00:35:18   As sad as that sounds,

00:35:20   you know, how can you not know what a horse is?

00:35:22   Technology is a bit different than emotions in that sense.

00:35:25   So you must be looking at databases and tags and image data

00:35:29   to understand what is going on.

00:35:31   And that also explains why in photos in iOS 10, there's a certain limitation of the categories of content that you can look for.

00:35:40   I saw someone had... I have no idea how they did it, but there's like 4,000 entries,

00:35:46   like possible combinations of keywords that you can put in the search field and return images from.

00:35:52   And, you know, it seems to me when you consider the two simplest explanations,

00:35:58   or the Occam's razor, if you look it up on Google, you will see what it means.

00:36:03   If you consider the simplest explanation of what is going on,

00:36:07   if you consider Apple's money, and if you look at the implementation in iOS 10,

00:36:11   that's the only two conclusions I can reach.

00:36:15   There must be some people inside Apple providing their own photos,

00:36:18   and Apple must be buying photos from someone else who is not the user base.

00:36:23   Makes sense? I wonder how well it will work for real life situations of people who are not...

00:36:31   I don't run a stock image database, I just go out in the world and take pictures of pastimes and beaches, but we'll see.

00:36:40   I mean, for now, it works okay, it works alright, it's not as amazing or jaw-dropping as Google search,

00:36:47   Google Photos I mean, and it's not flexible of course, you cannot search for emoji for example.

00:36:53   And I wonder how well the privacy conscious approach of Apple with Sync and not wanting to

00:37:02   store user data in the cloud will work over time. For example, let's say that I have my iPhone 6s,

00:37:10   it's doing a bunch of data analysis with the computer vision, whatever is the name,

00:37:14   and they're looking at faces, they're looking at my stories inside of my photos,

00:37:20   so they know that there's a few restaurants, a few pictures of the sea,

00:37:25   but what happens when I sell my iPhone and I get a new one?

00:37:28   Do they sync with iCloud?

00:37:30   Right now, Apple is saying the current beta of iOS 10 doesn't sync your face data across devices,

00:37:36   so maybe that sounds like face sync will come eventually.

00:37:39   And my understanding is, as far as the scene recognition goes, or the object recognition goes,

00:37:47   when you buy a new iPhone, you will start from zero, and as soon as you charge your iPhone overnight, leave it plugged and connected to WiFi,

00:37:56   it'll do its own processing and you will end up recreating the same scenes and the same object recognition,

00:38:03   but you will start from zero again,

00:38:05   because Apple is not storing any of this metadata about you

00:38:09   in your iCloud account.
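
[Editor's note: a toy model of the on-device policy just described: scene analysis only runs while the device is charging and on Wi-Fi, and a brand-new device starts with an empty index because none of this metadata lives in iCloud. The Device class and all its fields are invented for illustration.]

```python
# Sketch: photo indexing is deferred until the device is plugged in and on
# Wi-Fi, and each device rebuilds its scene index from scratch.
class Device:
    def __init__(self, charging=False, on_wifi=False):
        self.charging = charging
        self.on_wifi = on_wifi
        self.scene_index = {}  # photo_id -> detected scene labels

    def maybe_process(self, photos, classify):
        """Index any unprocessed photos, but only under the right conditions."""
        if not (self.charging and self.on_wifi):
            return 0  # defer the work until overnight charging
        processed = 0
        for photo_id, pixels in photos.items():
            if photo_id not in self.scene_index:
                self.scene_index[photo_id] = classify(pixels)
                processed += 1
        return processed

library = {"p1": "raw-pixels-1", "p2": "raw-pixels-2"}

old_phone = Device(charging=True, on_wifi=True)
old_phone.maybe_process(library, lambda px: ["beach"])  # indexes both photos

new_phone = Device()              # new device: the index starts from zero
new_phone.maybe_process(library, lambda px: ["beach"])  # deferred: unplugged
```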

00:38:11   - What do you reckon the success rate of that re-scanning is?

00:38:15   Reckon it's 100%?

00:38:17   Because this is what I'm thinking, right?

00:38:18   Like I imagine I knew I could search for X image,

00:38:21   I got a new phone and I can't find it anymore.

00:38:24   It can't be 100%, right?

00:38:26   I mean, maybe it's so good,

00:38:28   the likelihood of you ever hitting that problem

00:38:32   is so slim, right?

00:38:33   But if it's analyzing every image every single time,

00:38:39   I wonder if it would get every single thing in it correct

00:38:42   every single time, if you keep doing it device to device.

00:38:45   I do want to read very quickly, underscore in the chatroom,

00:38:48   put a quote in from Craig Federighi from the talk show Live.

00:38:52   So this was about kind of where the data comes from.

00:38:55   And he said, the other thing is there's this idea that,

00:38:57   well, if you don't have the data, how would you ever learn?

00:39:00   Well, turns out, if you want to get pictures of mountains, you don't need to get it out of people's personal photo libraries.

00:39:06   Like, we found out that we could find some pictures of some mountains.

00:39:10   Yeah, I'm sure you did.

00:39:12   I mean, that again follows the theory that if you leave the users out of the equation, it leaves either Apple people or other companies.

00:39:24   And I mean, you don't necessarily need to go to something like Shutterstock or Getty Images

00:39:32   and say "look, we have a lot of money, we want to buy all of your pictures".

00:39:37   You can go to any kind of public library database and look at public photos, public domain stuff.

00:39:45   I'm sure Apple has internal databases of pictures they've used over the years for stuff like

00:39:50   iPhoto demos or iPhoto marketing materials, you know, those kind of pictures.

00:39:55   In fact, funny story, I once got an email from someone who knew a family who had

00:40:02   this sort of contract with Apple to provide pictures of their kids and their

00:40:07   dog for the iPhoto team and I was very sad when this person emailed me a few

00:40:13   weeks ago to tell me that the dog passed away but it's definitely like a real

00:40:17   thing to provide photos to Apple of your own vacations. It's basically like you

00:40:21   aren't exactly an actor, but you're shooting pictures and taking videos for

00:40:26   Apple to provide them with this data to use on the website, in keynotes, you know

00:40:31   stuff like that. So Apple has its own set of images either from this sort of

00:40:36   marketing purposes or from employees and then they can go out to other companies

00:40:39   public domain organizations and look at that data and sure you don't

00:40:46   need to look at private pictures from Myke and his piles of cash, you can look at piles

00:40:52   of cash from other people, I guess.

00:40:54   Like you?

00:40:55   Not really, but in theory, yes.

00:40:59   How does Apple make their algorithm better, and how does my phone get that data?

00:41:10   Exactly, so this is a discussion we were having, me and you and I think Gray at WWDC, so we were talking

00:41:17   what if a new object is created in the world, a human-made object and Apple doesn't have this information?

00:41:26   A good way to, we were going back and forward on this quickly, a good way to think of this is

00:41:31   imagine the PlayStation 5, right? Like we're not talking about the creation of a new plant

00:41:37   but something that will be iconic immediately and people will know what it looks like.

00:41:43   Like I say you take a picture of the one that you just bought and you want to find it.

00:41:47   So I just wanted to put that because this is something when we were talking about this

00:41:50   meaning new we went back and forth on this like what does that look like?

00:41:53   But imagine a new product that is for sale, right? As a new thing created by man.

00:42:01   Yes, so Google of course has an advantage because they can look at data from websites

00:42:08   indexed in Google search, provide those image results in Google image search, and as soon

00:42:14   as people click on those image results, say from the PlayStation 5, they can know "ok,

00:42:19   this is a product called PlayStation 5" and in Google Photos you should be able to search

00:42:23   for PlayStation 5 and your picture comes up.

00:42:26   Apple has no web engine to look at data from websites and say,

00:42:33   "Okay, there's pictures of this new product, now let's bring it back into photos."

00:42:38   It's different, right?

00:42:40   And to do this kind of quick update to the algorithm,

00:42:46   it would require Apple to have these databases be constantly updated with new information.

00:42:51   And again, it's not impossible because if you go to, you know, when there's any public event,

00:42:56   you're gonna find photos on Getty Images really quickly after the event, because they have

00:43:02   photographers going to these events and taking pictures. And I assume this also works for

00:43:08   marketing materials. I mean, you could probably have a deal with companies like BusinessWire,

00:43:14   companies that publish press releases that often include text and photos of

00:43:23   new product announcements, you could probably make a deal with those companies and say,

00:43:27   "Okay, every time a new company announces a new product, we want access to that text

00:43:31   and that picture in a database-like object, and we want to analyse that."

00:43:35   It's not impossible, especially when you have a lot of money.

00:43:38   When you have a lot of money, you can buy a lot of stuff.

00:43:40   This seems functionally more complex and expensive.

00:43:44   Yes, it's a very convoluted way to approach image analysis, I think.

00:43:50   But it also, on the other hand, it's a very respectful way maybe to say, look, we want

00:43:57   to give you the utility of searching for your photos, but we don't want to look at your

00:44:02   photos.

00:44:03   And I feel like a lot of people, including me from a certain perspective, are comfortable

00:44:07   knowing that Apple generally cares about not looking at your stuff individually to improve

00:44:14   an algorithm that benefits everyone.

00:44:16   I really think it's just a personal opinion thing because I mean I don't think that there's

00:44:23   a Google engineer looking at my pictures.

00:44:25   It's just a different computer, it's just somewhere else.

00:44:28   You're playing on principle here, right?

00:44:31   And everybody is entitled to feel the way that they want to feel and it's really nice

00:44:36   that Apple is providing a choice to the people that don't ever want that data to leave their

00:44:42   devices.

00:44:43   Like it's definitely good for that.

00:44:45   I do feel though that you're making a trade-off

00:44:48   and some of the trade-offs will come in the functionality.

00:44:51   'Cause I just personally cannot conceive of a world

00:44:56   where Apple stuff could ever be as good as Google's.

00:44:59   I'm not saying though that it even needs to be.

00:45:01   There is a level of good enough.

00:45:04   I don't know what that level is.

00:45:06   But there is a level of good enough.

00:45:11   My brief experiences with photos so far,

00:45:16   it's not good enough.

00:45:18   So one of the places that I've found

00:45:21   some serious issues with is faces.

00:45:24   The face recognition stuff, it's not very good right now.

00:45:29   So I'll give you an example

00:45:31   of why I think it's not very good.

00:45:33   It picked out pictures of Adina

00:45:36   and put them into multiple different people.

00:45:39   - Yes.

00:45:40   The one that I found the most ridiculous

00:45:42   was eyes open, eyes closed.

00:45:44   If a picture of Adina with her eyes closed,

00:45:47   like say we were out in the sun,

00:45:48   and I took a picture of her and she was like squinting,

00:45:51   it couldn't recognize that they were the same person.

00:45:53   And that folder was just about 12 pictures

00:45:56   of her with her eyes closed.

00:45:58   It just couldn't match that they were the same person.

00:46:01   And that, I'm sure that the reason is like,

00:46:05   because the way that facial recognition works

00:46:07   is to pick out those key elements on a face,

00:46:10   which is like nose, eyes, and mouth and stuff.

00:46:14   But it really feels like an advanced

00:46:16   facial recognition engine should be able to detect

00:46:20   between eyes open, eyes closed.

00:46:23   And I just think-- - From the same person.

00:46:26   - From the same person.

00:46:27   And I just think that that is not,

00:46:30   that is not sufficient, right?
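
[Editor's note: a made-up illustration of the failure mode Myke is describing. If face clustering compares landmark feature vectors against a distance threshold, a big change in just the eye features can push an otherwise identical face past the cutoff and into its own "person". The vectors and threshold here are entirely invented.]

```python
# Greedy threshold clustering over toy landmark vectors: the closed-eye
# photo of the same person ends up in a separate cluster.
import math

THRESHOLD = 0.5  # hypothetical max distance for "same person"

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def assign(face, clusters):
    """Join the first cluster whose anchor face is close enough."""
    for anchor, members in clusters:
        if dist(face, anchor) <= THRESHOLD:
            members.append(face)
            return
    clusters.append((face, [face]))  # no match: a brand-new "person"

# (eye_openness, nose_position, mouth_width) -- toy landmark features
eyes_open   = (1.0, 0.5, 0.4)
eyes_open_2 = (0.9, 0.5, 0.4)
eyes_closed = (0.0, 0.5, 0.4)   # same face, eyes shut

clusters = []
for face in (eyes_open, eyes_open_2, eyes_closed):
    assign(face, clusters)
# The closed-eye photo lands in its own cluster, even though only the
# eye features changed.
```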

00:46:32   I just, so I'm opening, I'm opening Google right now,

00:46:36   and I can only see one entry for Adina.

00:46:41   So it either means there are many, many, many photos of her

00:46:45   that it's not picking out and it's just not showing me them

00:46:48   or it's getting them right. - They haven't seen yet.

00:46:51   - No, all my photos are in Google Photos.

00:46:53   - Oh, okay. - Right, okay.

00:46:54   And I've just found one of the images with her eyes closed

00:46:58   that I remember Apple was picking out

00:46:59   'cause I remember when it was from.

00:47:01   And it's in the field for who Adina is.

00:47:05   So it is at least doing that.

00:47:08   And the other thing,

00:47:08   I'm looking at when I was looking at my photo library

00:47:10   on my iPad, there are so many entries for the same person.

00:47:15   And the way that you merge them is infuriating.

00:47:20   You have to go in every time.

00:47:22   So I will name Federico.

00:47:24   It pulls up from my contacts.

00:47:26   I like that.

00:47:26   It's like, great, you know that person is Federico.

00:47:28   I go to the next one.

00:47:29   I have to type it again.

00:47:30   Type in Federico, click it.

00:47:31   It's like, would you like to merge?

00:47:32   Okay, merge.

00:47:33   Now go to number three, Federico.

00:47:36   I'm not gonna do that.

00:47:37   And do you know why I'm really not gonna do it?

00:47:38   Because I have to do that on every device that I own.

00:47:41   - Yeah. - Because they don't sync yet.

00:47:43   That's the problem. - That is unacceptable.
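
[Editor's note: a sketch of the naming-and-merging flow Myke just walked through, with invented data structures: naming a face cluster after a contact, and merging any cluster that gets a name another cluster already carries. Because the mapping is stored per device and doesn't sync, the same manual work repeats on each device.]

```python
# Name a face cluster; if another cluster already has that name, merge.
def name_cluster(library, cluster_id, person_name):
    """Attach a name; if another cluster already has it, merge into that one."""
    for cid, data in library.items():
        if data["name"] == person_name and cid != cluster_id:
            data["faces"].extend(library[cluster_id]["faces"])
            del library[cluster_id]       # the "would you like to merge?" step
            return cid
    library[cluster_id]["name"] = person_name
    return cluster_id

ipad = {  # one device's local face clusters; nothing here syncs via iCloud
    1: {"name": None, "faces": ["f1", "f2"]},
    2: {"name": None, "faces": ["f3"]},
}
name_cluster(ipad, 1, "Federico")  # first cluster: just gets the name
name_cluster(ipad, 2, "Federico")  # second cluster: merged into the first
# ipad now holds one "Federico" cluster, and every other device has to
# repeat the same naming against its own local library.
```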

00:47:45   Wi-Fi sync, just do Wi-Fi sync or something.

00:47:48   I know that I'm an edge case

00:47:52   'cause I have three or four iOS devices,

00:47:56   but then I've also got my Mac.

00:47:58   So let's say many people will have three or two.

00:48:02   you don't want to go and do that on every single device.

00:48:04   And then is it useful then like,

00:48:06   because I'm now not gonna ever use the faces in iCloud

00:48:11   because it's not going to be complete.

00:48:14   The information isn't going to be complete. I just, I'm not happy with it.

00:48:19   And also like the places view, um,

00:48:22   all it has for me so far and I've had it chewing on my photo database for,

00:48:26   you know, on my iPad Air 2 since the first beta. And I put, um,

00:48:31   I put it on my iPad Pro, the 12.9, a couple of days ago

00:48:35   when the third beta came out.

00:48:37   So it's had enough time to look at my memories, as it were,

00:48:42   but all it has, oh, I just opened my iPad,

00:48:45   and because I hadn't done it on this iPad,

00:48:47   it now won't show me.

00:48:49   But I looked, 'cause I have to rebuild the database again.

00:48:52   But yesterday, I looked at it, and all it had

00:48:54   was my one trip to New York last year.

00:48:58   Yeah, mine's been on my iPad since beta 2 and there's only a couple of

00:49:04   memories built there. Compared to what Google Photos does... I just opened Google

00:49:08   Photos as well and it's showing me stuff from like two, three, four, five and six

00:49:13   years ago today. It's showing me a trip, it's showing me all this stuff. They do so

00:49:17   much better and it's really night and day. And I see what you say, Myke,

00:49:23   about there maybe being a good enough, because I think Google Photos is sort of a

00:49:28   nerdy product. I don't know how mainstream it is. I know it's sort of

00:49:31   baked into Android so maybe a lot of people are using it but it doesn't seem

00:49:35   like it is as well known, at least to our community, as photos is of course and it's

00:49:42   kind of a shame because Google Photos does a really good job. I back up to Google Photos

00:49:46   A) for a backup but B) really for this memory stuff because it's so good.

00:49:51   I do it for search stuff only. I have a bunch of duplicates in there because I set it up kind of wonky.

00:49:56   Like if I look at it on my phone, it's pulling from a database but also trying to use my phone.

00:50:01   But I purely use it for the memory stuff because it's nice

00:50:06   And for searching of images and it does a great job of that

00:50:09   But like I just looked at my iPad then and there were two different memory albums for my one trip to New York

00:50:16   one of them that was in London and one of them that was in Romania and

00:50:20   And when I look at Google Photos, I have endless lists of every single time I've ever left the country,

00:50:26   ever left from home. Like I've got stuff going back to 2011 in here. And they're like just the

00:50:32   ones that I've decided to save because what Google does, which I like, is they're like,

00:50:36   "Hey, we've recognized that this might be a thing that you did. Would you like to save this?"

00:50:41   So it does that, right? But they're the ones that I've chosen to save. But I feel like the Apple

00:50:47   memory stuff is not picking up like any of my WWDC trips that I take every single year.

00:50:54   And I don't know if like, am I still multiple weeks away from completing the search of my

00:51:00   images?

00:51:01   So I think I'm doing a lot better than you guys.

00:51:06   So one thing that might be about that is I have optimized storage on my devices.

00:51:11   Me too.

00:51:12   Me too.

00:51:13   Okay.

00:51:14   But that just made me think, what happens to the images that are not downloaded?

00:51:16   Are they checked?

00:51:18   They have said that they use the thumbnail data.

00:51:20   That that's enough. I think that was in a

00:51:22   maybe the state of the union.

00:51:24   Someone probably probably knows.

00:51:26   But I believe they've said that the thumbnail

00:51:28   is enough data for them to

00:51:30   do their process.

00:51:32   So I'm looking at my memories right now.

00:51:34   And iOS created the first

00:51:36   so I put beta 1 on my phone

00:51:38   as soon as it came out.

00:51:40   As soon as I was back in Italy actually.

00:51:42   So I was back in Italy on June 19th

00:51:44   and the first set of memories was on June 20, three memories.

00:51:50   And I'm scrolling through and basically iOS I would say created four to five memories

00:51:57   each week.

00:51:58   So I got the latest one today, actually two new memories today, one yesterday, another

00:52:06   earlier in the week.

00:52:08   So I would say yeah, about four or five new memories every week.

00:52:11   And they're kind of nice, you know?

00:52:13   It does a good job at creating memories for locations, for stuff like weekends.

00:52:18   I see what it's doing and I don't like this implementation. It's just giving me random

00:52:24   things on different days, right?

00:52:26   Yes.

00:52:27   See, I don't think that makes sense.

00:52:29   So what I know is there's going to be some fixed type of memories. So there's an on-this-day

00:52:36   one that is often created but not every day.

00:52:40   There's also a birthday memory that's all about you that will be placed in the memories view on your birthday.

00:52:49   I have a to-do to check on my birthday if it comes up.

00:52:52   And yeah, that I think are the two regular ones.

00:52:58   Alright, so I'm clicking through this now and I think that basically my issue now is not the data, it's the UI.

00:53:05   Because if I click into a memory, one of the memories it's given me, I get related memories

00:53:11   that I haven't seen before.

00:53:13   But why is it...

00:53:14   I don't understand, why do I only get access to like one of them every day if the device

00:53:20   has done them?

00:53:21   So maybe the memories that you see in the main screen are like highlights from your

00:53:26   memories.

00:53:27   But it's like picking out completely random things.

00:53:29   It's not showing me like this time last year.

00:53:32   Like it's showing me from Saturday stuff that happened in like November 2015.

00:53:41   I don't know how it works.

00:53:43   I will try to confirm details.

00:53:45   So I'm now going to change my complaint to a UI complaint.

00:53:48   Because I'm like I can see oh here's my March 2016 trip to Texas.

00:53:52   But to get that I had to click into two different memories and go to related.

00:53:56   If you're doing this why can't I just see it all?

00:54:00   Yeah.

00:54:01   - I mean, Google puts it all in line.

00:54:03   - Yeah, so it's just a list chronologically

00:54:05   of all the memories that I have.

00:54:07   I do like the places view though,

00:54:09   I will say that at least.

00:54:09   That's one good thing that I have to say.

00:54:11   I'm happy the places view is there.

00:54:13   Seeing my photos on a map tends to be a good thing

00:54:15   that I like.

00:54:16   - Yeah, there's a lot of information that gets hidden

00:54:20   by the photos app and you need to click through

00:54:23   into the related stuff.

00:54:25   I think I'm a fan of photos.

00:54:30   So my idea is, we just talked about on remaster, we talked about Pokemon Go and you made a point of

00:54:38   it's not a really advanced Pokemon game in the sense that you cannot do a lot of the more complex Pokemon stuff with evolutions and customization

00:54:50   but it's good enough for a lot of people to get into.

00:54:53   And I think you can draw a comparison with what Apple is doing with photos.

00:54:57   It's not as advanced as what Google Photos can do.

00:55:01   Maybe it's good enough for a lot of people to get started on it, to not get the impression

00:55:07   that Apple is being creepy, just being useful in a very small dosage.

00:55:14   Maybe Apple is going for it.

00:55:16   Yeah, I think there's still some work that needs to be done.

00:55:22   The places stuff isn't as bad as I thought, right? Like there is

00:55:26   stuff that's happening, it's just not being shown to me. But the things like with not

00:55:30   being able to sync this stuff from device to device, I think makes it worse, especially

00:55:35   with the faces stuff. You know, for me, I just can't, I can't help but compare this

00:55:40   to Google Photos, because I find that implementation to work really well. I know that it's, it's

00:55:50   It's like effectively a different product because of the way that the data is being

00:55:54   found.

00:55:55   They're so fundamentally different, but unfortunately they position themselves, Apple has positioned

00:56:02   iCloud photos to be like Google Photos.

00:56:06   They've given you all the features.

00:56:08   So for me, I can't help but compare them because it's the same data that I have going in, but

00:56:14   it's completely different results coming out the other end.

00:56:19   Of course the interface can still be fixed by the final release.

00:56:23   I don't think it will change a lot because Beta 3 is still the same so it's kind of an

00:56:27   indication that it's staying as it is right now.

00:56:31   And I think it could be clarified, especially the need to tap through to see related memories.

00:56:39   It's basically just a vertical list of random memories.

00:56:42   There's no schedule, let's say, of memories.

00:56:49   It's basically meant to be like a daily or almost daily surprise.

00:56:54   You open photos and you see a new memory.

00:56:57   From a serendipity point of view I can maybe buy the argument of being surprised every day

00:57:03   and seeing random memories show up because if it becomes fixed and more precise maybe

00:57:09   becomes boring, whereas maybe Apple wants to go for the surprise effect of taking this

00:57:14   random weekend from two years ago and bringing it back to you.

00:57:18   I can imagine the random person being "oh yeah, I forgot about that Saturday when I

00:57:21   got so drunk people took pictures of me on the ground".

00:57:24   But you know, not everyone gets drunk and gets pictures taken by other people and

00:57:30   maybe some better organisation could be needed.

00:57:33   I don't know.

00:57:35   You know what I'm saying, Myke?

00:57:36   Yeah, I completely agree.

00:57:38   I completely agree.

00:57:40   I'm interested to see, as I say, how it goes like you did.

00:57:44   The fact that, as you say Federico, the UI hasn't changed probably means the UI won't

00:57:48   fundamentally change.

00:57:51   And I wonder about how scalable this is.

00:57:55   Like, it just feels like Apple is throwing like a ton of money at this problem.

00:57:59   If what you believe, and I am inclined to believe your thinking on this is true, that

00:58:04   they're just buying photos, like, at what point does it just become...

00:58:09   more likely that Apple is buying photos or that they've created God. They're definitely

00:58:16   buying photos. But you know at some point there has to be some realism coming into play.

00:58:25   There are some subreddits where people think that Apple do that every day. But I just feel

00:58:36   like at a certain point you've put so much money into this. I wonder if it's really a

00:58:42   competitive advantage. Like is the privacy really working for you that well? I don't

00:58:48   know. I don't know. I mean, that's kind of like, that's always the big rock

00:58:54   with any of these services from Apple, that it works the way it does and is engineered

00:59:00   the way it is because of the privacy, and that is a fence post that they were not

00:59:05   willing to move and I commend them for it even though I use Google services every day

00:59:09   for personal work stuff.

00:59:10   I'm happy someone's doing it.

00:59:12   Great.

00:59:13   I am too but I think what you're getting at is you have to balance that right and the

00:59:20   privacy stance, as good as it is, as much as I appreciate it, does mean that something like

00:59:25   photos may not be as good as what Google can do because Google can look at your photos

00:59:29   directly and that is a personal choice you have to decide which service you want to use

00:59:34   or be like us and use both. But it is something to consider that in talking

00:59:40   about Apple services and in using them and comparing them to Google or anybody

00:59:44   else, that is always what it comes back to. Like iCloud mail is not as good as

00:59:48   Gmail because Gmail can do more things with your messages because they can see

00:59:52   more. And that's just a, you just have to understand that that's part of the

00:59:56   equation. And you know, would Apple be able to build a better photo app if they

01:00:02   could see my photos directly? Probably. But at the same time this is not who

01:00:08   they are and it's not you know what the company's about and it's just something

01:00:11   that we all have to kind of reconcile in choosing a service. The privacy angle and

01:00:17   you know how well does it work? How much data can it know about me and and what

01:00:22   do I gain from that? You know we spoke a really long time ago on the show about

01:00:25   we are willing to give Google data because of the benefit that we get out

01:00:30   of it and that is something that I think we continue, all three of us continue to

01:00:36   stand by and for some people that bar is just different places. Here's a simple

01:00:40   final question to ask. Would a smart service more like Google Photos help

01:00:47   Apple sell more iPhones or is a better photos app with some simple intelligence

01:00:53   or moderately advanced intelligence good enough to sell iPhones?

01:00:58   I don't know the answer to that question.

01:01:03   Me neither.

01:01:04   Because I feel like it could be both and that's not an answer.

01:01:09   I like those types of answers. Think about it.

01:01:13   Because the thing is, both Apple and Google believe they're doing the right thing to sell more devices.

01:01:19   That's why it's so difficult. Both of those companies believe that the route that they're taking will help them sell more.

01:01:25   I would disagree. I might even put a word in your mouth, but I don't think Google's primary objective through something like Google Photos is to sell more devices.

01:01:35   Google wants to organize the world's information, and them understanding what an object is when they see it is important to their overall mission.

01:01:44   Now Google Photos is a great service and because of its integration with Android it may help.

01:01:50   Sell more Android phones, I don't know.

01:01:52   Well let me refine that then. Each company thinks that they're doing the best thing to make the most compelling photo services.

01:01:58   There you go.

01:01:59   But for Apple that means more devices because the only way you can use Apple services is on their devices.

01:02:05   And Google just wants as many people as possible in the world to use Google Photos.

01:02:09   So they both think that they have the best thing to get the most people.

01:02:14   It sounds funny but he has a point. Chris Hannah in the chat room is saying Google

01:02:18   also helps Apple sell devices as well with Google Photos.

01:02:22   Yeah, I don't know if Apple likes that so much, right?

01:02:25   Well it's possible, so you know. It's interesting when you think about it because again we are

01:02:34   constantly looking for more features, more advanced functionalities,

01:02:41   but at some point there's a threshold where the market, which is not the Relay

01:02:46   FM staff, reacts to good enough in a way that maybe we cannot imagine. And maybe

01:02:54   that threshold is met by photos in iOS 10. I don't know.

01:02:59   Yeah, I mean, there are going to be people who see this in iOS 10 and are going to be

01:03:05   completely blown away and smitten with it.

01:03:07   And that's great.

01:03:08   I mean, I like that it's built in.

01:03:11   I agree with Myke.

01:03:13   There's some issues with it still, but it's still a beta.

01:03:16   But there are going to be a lot of people who don't know about Google Photos.

01:03:19   Set all the privacy stuff aside.

01:03:21   They just haven't heard of Google Photos because it's just one service in Google's large portfolio.

01:03:26   this is going to make their experience better and if Apple ever gets the

01:03:30   syncing and metadata working or they're willing to cross that line it will make

01:03:34   it even better again and that's really, I think, where Apple services in

01:03:39   general sort of fall for me that something like Apple news a lot of

01:03:43   people are going to like it has a large user base podcast app has a large user

01:03:48   base but there will be people and they happen to be the three of us and

01:03:52   people who listen to these shows who do want more or need more, honestly, and

01:03:57   that's where more professional, more robust services exist. But Apple is aiming

01:04:03   for the mainstream and just because it doesn't meet our needs doesn't mean

01:04:08   it's not a good product or a good service. Something like Apple News has lots of

01:04:11   users even though it's not for me. And I think that's an important thing to

01:04:15   remember. It's something that's easy to forget when we're just talking in the nerd circle

01:04:18   about this service versus that service, a lot of people just are going to use what's

01:04:23   built in. And so Apple's focus to make the built-in tools better is where they

01:04:28   should be spending their time, honestly. As much as I would like iCloud email to

01:04:33   have a bunch of server-side rules like Gmail does, most people just care about

01:04:38   the experience of using webmail and they want it to work well and they want it to sync

01:04:41   and iCloud mail does all that. And again they're aiming for the mainstream.

01:04:46   and a lot of, I will say, old Mac nerds in particular,

01:04:51   get sort of angsty about that, right?

01:04:54   Like there's not a Mac Pro, Final Cut Pro X isn't good,

01:04:58   like there are people who see Apple moving to the mainstream

01:05:01   and don't like it, but the simple reality is,

01:05:05   Apple has done that and has been hugely successful

01:05:09   for the company and they're going to continue

01:05:13   to make products and make services,

01:05:15   make cloud services for people in the mainstream and that's a new thing

01:05:19   right there. There are still a lot of people who aren't using any cloud

01:05:22   services, or very few, and the only ones maybe they are using are the ones that

01:05:26   are sort of baked into iOS, so something like iCloud backup, you just

01:05:30   turn it on, and it may be the only cloud service someone uses, but because

01:05:35   it's built in because it's from Apple they feel like they can trust it and

01:05:38   they are making it you know making inroads into the the average consumers

01:05:43   habits and lifestyle.

01:05:45   And so for a lot of people, photos on iOS 10

01:05:49   and on MacOS Sierra are going to be the first time

01:05:54   that they see something like this.

01:05:55   And for a lot of people that'll be all they want

01:05:57   and for some other people it'll be like reading lists

01:05:59   and Instapaper of like, oh, I'd like to save bookmarks

01:06:01   for later but I want a little bit more.

01:06:03   And then they go and shop for something else.

01:06:05   So I think it's just another example

01:06:07   in that long line of examples.

01:06:12   Kyle Seth Gray in the chat room has pasted an article,

01:06:15   a VentureBeat article from May, where Google says

01:06:17   that they have 200 million monthly active users.

01:06:20   So there are definitely some people, right,

01:06:23   that are finding Google Photos,

01:06:24   but that point that you make there, Steven,

01:06:27   cannot be discounted.

01:06:28   Like, it is one of the great things

01:06:30   about having it built into a device like this,

01:06:32   because now people will get something

01:06:34   that they didn't know that they wanted.

01:06:36   And for those people, a lot of these features

01:06:39   will be good enough straight out of the gate, right,

01:06:40   because this is their first try into this stuff

01:06:43   and they'll just be maybe delighted

01:06:45   with the things that they find.

01:06:46   But then I think a lot of people will then maybe try

01:06:49   and look around to see if anybody else is doing it

01:06:52   and that might end up helping Google, but who knows?

01:06:55   - Yeah, absolutely.

01:06:56   - All right, if you'd like to find our show notes

01:06:59   for this week, head on over to relay.fm/connected/101.

01:07:03   Thanks again to Smile for sponsoring this week's episode.

01:07:06   If you'd like to find Federico online,

01:07:08   He is @viticci, V-I-T-I-C-C-I, and writes at macstories.net.

01:07:12   Steven is @ismh and he is at 512pixels.net.

01:07:16   And I am @imyke, I-M-Y-K-E.

01:07:20   We'll be back next time.

01:07:21   Until then, say goodbye, guys.

01:07:23   - Arrivederci.

01:07:24   - Adios.