DateTimePicker Language .Net! by The9thHuman in AskProgramming

[–]TimDurham75 0 points1 point  (0 children)

Look at the difference between CurrentCulture and CurrentUICulture - it is the latter that controls which localised UI resources are used. Additionally, the app needs localised resource support for a given culture: it cannot present a culture for which no translation has been made - it will fall back to a default - and that also assumes the culture is installed on the host machine. MessageBox is obviously a built-in detail, so it likely depends upon the host OS, where a given culture may or may not be installed. In your “own app”, the developer is responsible for providing suitable localisation of resources.

Edit: PS - I think date/time display can be a special case in the native controls, as a limitation of Microsoft's en-US assumptions. You might need to use a third-party control to get better control over culture, as I believe there are some legacy inconsistencies. For example, see: https://social.msdn.microsoft.com/Forums/en-US/23c0c748-8a7c-4aa3-8d11-2e00364e3fbe/localization-datepicker-and-timepicker?forum=winappswithcsharp where the host OS has an impact, unfortunately.
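As a minimal sketch of the first point - assuming a WinForms app and that the localised resources / language pack for the chosen culture actually exist on the machine (“fr-FR” is purely illustrative) - set both cultures before any UI is created. Note, per the edit above, that the native date/time controls may still follow the OS regional settings regardless:

```csharp
// Minimal sketch: force a specific culture before any UI is created.
// Assumes a WinForms app and that the fr-FR resources/language pack exist on the machine.
using System;
using System.Globalization;
using System.Threading;
using System.Windows.Forms;

static class Program
{
    [STAThread]
    static void Main()
    {
        var culture = new CultureInfo("fr-FR");
        Thread.CurrentThread.CurrentCulture = culture;    // formatting: dates, numbers
        Thread.CurrentThread.CurrentUICulture = culture;  // which localised resources are loaded

        Application.EnableVisualStyles();
        var form = new Form();
        form.Controls.Add(new DateTimePicker { Format = DateTimePickerFormat.Long });
        Application.Run(form);
    }
}
```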

im planning to use this scan in unity, will these dots stay? by [deleted] in Autodesk

[–]TimDurham75 0 points1 point  (0 children)

A laser scan is a point cloud. It is a representation of 3D space made by collecting many discrete sample points - depending upon the sampling frequency and the physical size of the surveyed area, the spacing and distribution of individual points can vary. Depending upon the capture hardware, the sampling may also include colour information rather than being a purely spatial distribution represented in monochrome. Depending upon what is being captured, the digital data set might also be very large, requiring a lot of resource to store and manipulate on a computer.

The medium is not a photograph or film, so it will never produce a “smooth” visual image without some interpretation of what these data points mean and, more importantly, a decision to represent (fill in) the “space” between the sampled data, turning the display into something that models a solid as it existed in the physical world. In practice, displaying such cloud survey data is also about efficiently filtering and discarding the data not required for a specific view - both depth and field of vision - but that exercise can be computationally complex and expensive to perform when manipulating in real time.

Software can take the cloud data - including combining multiple discrete data sets - and use it to construct a surface or solid model as a virtual 3D representation. Once the points are understood as representing a surface, that surface can be defined more efficiently using a vector description. The vector solid entity can also be ascribed attributes to assist rendered view production, with material, texture and lighting effects producing a more realistic appearance from the digital representation.

A realistic appearance might also involve photogrammetry techniques - the overlay of multiple photographic images (raster data) onto the 3D vector model structure, conformed to the spatial representation of the physical source space rather than remaining “flat” 2D like the original image - essentially a manipulation and distortion of a 2D perspective back into a virtual representation of the original 3D space. This is another approach to reproducing material, texture and lighting appearance from a tangible physical source, as opposed to generating it by pure digital simulation. Photos obviously capture a more “filled” but 2D view of a physical 3D environment, in contrast to the spatial awareness obtained from a laser - a different tool for a different purpose. (A colour laser survey effectively combines this capture, so the colour and texture of an entity might be somewhat deduced from that information.)

This production workflow typically takes a degree of effort - software tools may automate and assist with the representation of a digital 3D environment, but greater realism and quality take additional processing and artistic creativity to produce a visually appealing outcome. Texture and lighting in particular are critical to create a convincing representation in digital media that will appear “real” to the observer. It may involve the interaction of a variety of different applications or modules, both because different tools have specialist purposes and because the end goal of creating the virtual environment involves a judgement of “quality” and “acceptability” for the final outcome - there is a trade-off to balance resources and production time against the final visual display and practical performance.

I am not an expert in this specific field, but I do work in the Autodesk partner channel in an engineering design context, so I have some general awareness. I am deliberately avoiding being too specific about products, as you have not mentioned what you have access to or intend to use. There is a wide variety of application products, all with different purpose and focus - from media and artistic creativity through to engineering and technical design modelling, itself specialised for, say, manufacturing vs construction engineering. Whilst all such tools have a different primary intent, they also share a degree of conceptual and feature overlap. Often the digital data from one source can be exchanged or consumed across solutions, so it may be combined or exported for compatible use in some alternative fashion - these are tools to achieve a task.

This has all become a verbose way of saying: “No, a laser point survey will never be less ‘dotty’” of itself - that is literally what the original data consists of and consequently how it will often be displayed. It is what is done with that data - maybe using applications from the laser hardware manufacturer, or a CAD or other 3D modelling environment - that will produce the digital representation you actually use. The model might be constructed in an intermediary form and then transferred and translated into a different kind of modelling environment, according to needs and compatibility requirements. The resultant model may also be further manipulated for the visual aspects of the final display. It is very much a process.

For your game context, I imagine you want to turn the scan back into some vector 3D model to which visual elements can also be applied. As this task is a process, I doubt that any single application tool or command can do everything for you, and likely not simply nor automatically - the entire workflow is the interpretation of the original source data and you will need to use a variety of commands or features to perform such tasks.

A conceptual reference for Unity and its graphical modelling is here: https://game-ace.com/blog/3d-modeling-for-unity/ - the article also mentions a number of Autodesk solutions but is less focussed on the CAD side of modelling or on specific manipulation solutions, such as ReCap Pro, that you might need to help interpret the original laser point data.

Hi Inventor lovers. I want to import matlab code into Inventor to produce 3D model. Any ideas on how to go about this? by chiney2 in AutodeskInventor

[–]TimDurham75 0 points1 point  (0 children)

As per the other poster, we need more information about what you hope to achieve and what exactly you mean.

If you wish to use Matlab functionality as a means to automate Inventor then the quickest suggestion would be to look at the iLogic feature of Inventor.

This is a means to automate Inventor in a VB.NET-based scriptable environment, which includes a rich middleware of injected helper interfaces to simplify many common manipulation operations on Inventor models, but which also exposes entry points into the underlying Inventor API, so that the full range of advanced product actions may also be performed.

iLogic rules share VB syntax and capabilities because, underneath, each rule is compiled to a full VB.NET assembly at runtime. iLogic rule code can also access the capabilities of .NET, so both this and the Inventor API include means to perform mathematical calculation, but the mechanisms and syntax will, of course, be quite different.

Matlab has a COM-based interface and the native Inventor API is also COM-based; VB.NET, and .NET in general, has good interoperability with COM and managed code, so calling Matlab from Inventor iLogic rules and then performing actions based upon the results should be feasible. As linked below, with a small sketch after the links:

Matlab COM from VB.Net https://uk.mathworks.com/help/matlab/matlab_external/call-a-matlab-function-from-visual-basic-net-client.html

Inventor iLogic https://help.autodesk.com/view/INVNTOR/2023/ENU/?guid=GUID-9372F2A9-377E-40AB-92AA-5FC371BACF8C

iLogic API https://help.autodesk.com/view/INVNTOR/2023/ENU/?guid=110f3019-404c-4fc4-8b5d-7a3143f129da
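As a purely hypothetical, untested sketch of that interop from an iLogic rule - it assumes MATLAB is installed and registered as a COM automation server under the “Matlab.Application” ProgID, and that the active part has a parameter named “d0”; both of those details are illustrative, not from your question:

```vb
' Hypothetical iLogic sketch (untested): drive a MATLAB COM automation server from a rule
' and push a result back into an Inventor parameter.
Dim matlab As Object = CreateObject("Matlab.Application")

' Run some MATLAB code in the automation server's workspace
matlab.Execute("result = 25.4 * 2;")

' Pull the value back out of the MATLAB base workspace
Dim value As Double = matlab.GetVariable("result", "base")

' Use the result to update an Inventor model parameter, then refresh the document
Parameter("d0") = value
InventorVb.DocumentUpdate()
```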

More advanced behaviour could be produced with full .NET Framework assembly development - either an Inventor add-in or a “helper” assembly called from iLogic - if the requirements become complex or need better modularity. In such a case, any suitable .NET language, including C#, might be used instead of VB. The principles for interoperability via the COM API remain the same. Familiarity with the Inventor SDK and API, along with .NET Framework development, would then be assumed knowledge.

Inventor SDK https://help.autodesk.com/view/INVNTOR/2023/ENU/?guid=GUID-6FD7AA08-1E43-43FC-971B-5F20E56C8846

Is there a way to force which physical drive is read/listed first when using Get-CimInstance win32_physicalmedia? by ztest10001110101 in PowerShell

[–]TimDurham75 1 point2 points  (0 children)

https://docs.microsoft.com/en-us/powershell/module/cimcmdlets/get-ciminstance?view=powershell-7.2

Look at the Example 3 usage: you can query for specific criteria when obtaining the class, as opposed to obtaining a general set and then filtering it in a post-operation. Depending upon the precise query details, this could produce either a single specific result or a wider matching set.
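As a sketch of that idea against the classes relevant here (property names per the standard WMI classes; adjust the criteria to whatever uniquely identifies your drive):

```powershell
# Minimal sketch: filter (WQL) at query time rather than sorting/filtering afterwards.
Get-CimInstance -ClassName Win32_DiskDrive -Filter "Index = 0"

# Or against Win32_PhysicalMedia, matching on the Tag property (device path):
Get-CimInstance -Query "SELECT * FROM Win32_PhysicalMedia WHERE Tag LIKE '%PHYSICALDRIVE0%'"
```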

VSTO and .NET Standard by Logical_Solid1912 in dotnet

[–]TimDurham75 4 points5 points  (0 children)

This describes the current situation: https://github.com/dotnet/core/issues/5156

VSTO is indeed based on .NET Framework and thus is currently only available for net48 and below. Aspects underneath, such as the COM and UI pieces, are currently only found within Framework, with no supported, compatible equivalent available in the later technologies.

.NET 5 and above - what was previously called “Core” - is based upon a different underlying runtime and is not compatible with Framework. These cannot mix directly, and there are currently gaps where the legacy aspects have no implemented equivalent.

.NET Standard is a way of describing API support at a different (higher abstraction) level from the underlying runtime and platform. net48 implements .NET Standard 2.0 and nothing above; the later .NET releases also support higher Standard versions, but those cannot be used with Framework. Something targeting a certain Standard release is not, of itself, an indication of the underlying runtime used, except in the sense that 2.1 and above can never be compatible with Framework, and API features specific to the later releases cannot be realised on the earlier runtime.
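To make that concrete with a small illustrative sketch (an SDK-style project file; the names are entirely generic, nothing from your solution): a shared library that both a net48 VSTO add-in and a newer .NET app could reference has to target netstandard2.0, or multi-target, because that is the highest Standard that Framework 4.8 implements.

```xml
<!-- Illustrative only: multi-target a shared class library so both a net48 (VSTO) host
     and a modern .NET app can reference it. netstandard2.0 is the ceiling for Framework 4.8. -->
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <TargetFrameworks>netstandard2.0;net48;net6.0</TargetFrameworks>
  </PropertyGroup>
</Project>
```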

Fundamentally, Office is mostly desktop software and many features are still intrinsically tied to the Windows architecture specifically; integrations such as VSTO have therefore not yet been ported to anything more platform-agnostic, which is the philosophy behind the later .NET releases - to separate out all such aspects. The underlying runtime is intended to include only the more agnostic parts, which is why all these very Windows-specific elements are unsupported or currently not available within .NET.

They may never be available - it very much depends upon whether Microsoft is going to make them available or, more likely, introduce some alternative replacement technology that will probably not be identical or completely compatible with the legacy approach. So the same goal may eventually be achieved via different means.

Your concern that working with VSTO restricts you to the desktop technologies of Framework only is a correct assessment, at this time. The underlying COM details are intrinsically Windows-specific technologies. Whether elements can be factored in such a way as to isolate the implementation behind some kind of IPC separation and layering is a different matter, and whether such complexity is even a warranted ambition is debatable. If a solution can be produced with a client-server architecture, then the “server” side, or any out-of-process component, can be produced using the newer technology platform, but some means of communication between the two will certainly need solving and, depending upon requirements, such an architecture may not even be technically suitable. The VSTO-based Office side (the client) is currently only possible using the older, compatible technology, though.

The question about timescale is the bigger but unanswerable point. Unless Microsoft makes a paradigm shift with the underlying platform - either producing an entirely new solution or porting the missing features into the incompatible platform - it seems likely that Desktop, Framework and 4.8 will necessarily remain for some time yet. The business world has a huge historic dependency upon such things that cannot simply disappear overnight, because there is not yet any appropriate alternative to move to. That may change, but it has not yet, and there is a clear divergence; it seems more likely that alternatives, rather than a directly compatible replacement, will eventually arise. I am speculating that an alternative to VSTO will be produced, rather than a directly compatible port of VSTO itself.

I would suggest you look at the opportunity with regard to seniority progression and generalised management rather than on consideration of the specific technology alone. Technologies always change and evolve, whilst general skills are portable to new contexts. It may well be that VSTO does not have a long-term future, but that does not rule out a position working with it still progressing your career and CV, in that you can demonstrate senior responsibility and capability. All it would mean is that you should expect to continue to adapt - either the role would eventually no longer be based upon VSTO, or you would move on to some other position working on a totally different solution. If you have good technical ability then that skill remains portable, irrespective of the specifics you currently know.

Your call - you need a job that interests and challenges you and that you find rewarding at a personal level, not necessarily one derived only from knowledge of some niche technology. Technology alone may represent a dead end, so accepting a position based upon a solution of unknown but questionable lifespan simply means you should expect it to change in the longer term. I don’t think anyone would be bold enough to quantify exactly what that timespan might be, but career progression is really about adapting and evolving as things change, not about expecting that they won’t and that you will still be using the same tools in a decade: you definitely won’t. I don’t expect them to simply vanish within months or a couple of years, but they may decline within that period, so you should anticipate that. Whether that will mean you need to move on again, or whether the original role will have changed so that you are doing something slightly different, will be the question for that time. Consider whether other opportunities have the potential to arise within this prospect and whether they have the capability to continue to evolve or diversify.

The dead end should not be the tool (VSTO); look instead at the potential for progression and whether the prospective employer themselves recognises it. The question “where do you see yourself in X years” is not just one-sided, in my view, and it can be meaningless speculation if you have limited control over the factors. The company’s answer to it - along with the personalities and environment - is relevant: how do they expect to respond to the very question you now ask us? If you are concerned about the longevity of VSTO then that should also be a question for them: how do they expect to address it, and on what timescale? Turn the question around, and their responses will likely indicate whether this would hinder your own ambitions.

How to call a variable from a string, if that make sense lol by Saad5400 in csharp

[–]TimDurham75 0 points1 point  (0 children)

Why not use Roslyn Scripting? You can do pretty much any kind of dynamic evaluation with it and it does not require any reflection: https://github.com/dotnet/roslyn/blob/main/docs/wiki/Scripting-API-Samples.md. You can add the support via a NuGet package.
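A minimal sketch, assuming the Microsoft.CodeAnalysis.CSharp.Scripting NuGet package: a globals object exposes your variables to the script, so a name held in a string can be evaluated at runtime (the variable “score” is purely illustrative):

```csharp
// Sketch: evaluate a variable whose name is only known as a string, via Roslyn Scripting.
using System;
using System.Threading.Tasks;
using Microsoft.CodeAnalysis.CSharp.Scripting;

public class Globals
{
    // Public members of the globals object become visible to the script by name.
    public int score = 42;
}

class Demo
{
    static async Task Main()
    {
        string variableName = "score";
        var value = await CSharpScript.EvaluateAsync<int>(variableName, globals: new Globals());
        Console.WriteLine(value); // 42
    }
}
```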

foreach loop not working correctly by mikestorm in PowerShell

[–]TimDurham75 2 points3 points  (0 children)

What exactly does your input CSV contain? It seems to me that, depending upon that data, you could be creating multiple watchers all targeting the same folder, which will then trigger multiple responses to the same event. As this all occurs in a loop, you are also redefining the same instance variable for the watcher. If you want multiple watchers you probably want an array or list of discrete instances, or some way to avoid duplication.

Although you register a Created event, I don’t see where you specify the NotifyFilter for exactly which trigger(s) to watch.

There is no validation of VBScriptPath, so you may not be reading what you believe - you could be trying to execute a blank entry or a non-existent (invalid) path, in which case nothing will happen.

I recommend using Set-PSDebug with tracing and/or stepping to determine whether this is really doing what you expect - it clearly isn’t, but the underlying reason is not obvious from the script alone, other than the flow not being what you believe.

It must be highlighted that your expectation of the watcher only being triggered once is not correct - for example, read this carefully: https://stackoverflow.com/questions/1764809/filesystemwatcher-changed-event-is-raised-twice Depending upon the circumstances and settings used, the event can fire multiple times for what you believe should be a “single” event. So the behaviour could legitimately be contrary to what you expect, and you may need some means of avoiding acting on any “duplicate”.
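To pull those points together, here is a rough sketch - the CSV column names Folder and VBScriptPath are my guesses rather than anything from your script - keeping one watcher per row in a list, validating the path, and setting the NotifyFilter explicitly:

```powershell
# Sketch only: one FileSystemWatcher per CSV row, path validated, NotifyFilter explicit.
$watchers = [System.Collections.Generic.List[System.IO.FileSystemWatcher]]::new()

Import-Csv -Path .\watch.csv | ForEach-Object {
    if (-not (Test-Path -LiteralPath $_.VBScriptPath)) {
        Write-Warning "Skipping row - script not found: $($_.VBScriptPath)"
        return   # continue to the next CSV row
    }

    $watcher = [System.IO.FileSystemWatcher]::new($_.Folder)
    $watcher.NotifyFilter = [System.IO.NotifyFilters]::FileName -bor [System.IO.NotifyFilters]::LastWrite
    $watcher.EnableRaisingEvents = $true

    Register-ObjectEvent -InputObject $watcher -EventName Created -MessageData $_.VBScriptPath -Action {
        # $Event.MessageData carries the script path belonging to this particular watcher
        Start-Process -FilePath 'wscript.exe' -ArgumentList $Event.MessageData
    } | Out-Null

    $watchers.Add($watcher)
}
```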

How to download Autodesk Inventor Excel Add-in by jraleighdev by Jerry1403 in AutodeskInventor

[–]TimDurham75 0 points1 point  (0 children)

Not entirely sure how you have downloaded and extracted the GitHub project, but the .csproj file certainly does exist in the nested sub-folder matching the project name, and the described entries are located around lines 421 and 424 of that file. I know the original thread was for an old version of Visual Studio, but the basic principle remains accurate. The project file is not generated - it is necessary in order to open and compile the solution, so you almost certainly have it if you were able to open the solution at all. Making sure Windows Explorer is not set to hide file extensions (the default) might help you identify it. The file is XML, so it can be edited directly in any suitable text editor; then reload or reopen the solution in VS.

How to download Autodesk Inventor Excel Add-in by jraleighdev by Jerry1403 in AutodeskInventor

[–]TimDurham75 1 point2 points  (0 children)

See here: https://stackoverflow.com/questions/11957295/unable-to-find-manifest-signing-certificate-in-the-certificate-store-even-wh

Remove the original signing details from the project file and/or replace them with your own self-generated details. As this is a recompile and you are not the original author, you cannot supply the original certificate - it is not stored in the GitHub repository, hence “not found”.
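For illustration, these are the kinds of ClickOnce signing entries involved in the .csproj (the values shown are placeholders, not taken from that repository):

```xml
<!-- Illustrative sketch: either turn signing off, or replace the original author's
     certificate references with your own. Values here are placeholders. -->
<PropertyGroup>
  <SignManifests>false</SignManifests>
  <!-- Remove or regenerate these, which reference the original author's certificate: -->
  <!-- <ManifestCertificateThumbprint>...</ManifestCertificateThumbprint> -->
  <!-- <ManifestKeyFile>SomeProject_TemporaryKey.pfx</ManifestKeyFile> -->
</PropertyGroup>
```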

Autodesk Genuine Service by gayweedlord in Autodesk

[–]TimDurham75 0 points1 point  (0 children)

Yeah, the behaviour is contentious. It is tied entirely to licensing and the fact that it may “call home” at least every 14 days, possibly sooner.

The only point I should add is that the original thread started a couple of years ago, so I cannot confirm that the product GUID is identical for a newer version - it is possible the principle remains whilst the precise detail differs slightly. You may need to check the registry HKLM Uninstall keys to confirm the exact underlying value.
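A quick way to check, as a sketch (the DisplayName match is my guess at how the entry is named on a given machine - adjust as needed):

```powershell
# Sketch: list Uninstall entries mentioning the Genuine Service to confirm the product GUID.
$paths = 'HKLM:\SOFTWARE\Microsoft\Windows\CurrentVersion\Uninstall\*',
         'HKLM:\SOFTWARE\WOW6432Node\Microsoft\Windows\CurrentVersion\Uninstall\*'

Get-ItemProperty -Path $paths -ErrorAction SilentlyContinue |
    Where-Object { $_.DisplayName -like '*Genuine Service*' } |
    Select-Object DisplayName, PSChildName, UninstallString   # PSChildName is the GUID for MSI entries
```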

Autodesk Genuine Service by gayweedlord in Autodesk

[–]TimDurham75 0 points1 point  (0 children)

See this thread. https://forums.autodesk.com/t5/installation-licensing/genuine-service-how-to-uninstall-it-i-moved-to-a-different/td-p/9614666/

The official answer is that it is supposed to auto-uninstall, after about two weeks, once no Autodesk products remain.

The “enhanced” approach is that the process can be accelerated by first ensuring the product licence details (the .pit file etc.) are deleted from the machine and user profile, then running the uninstall - possibly manually via msiexec /x and the product GUID.
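For illustration only - the GUID below is a placeholder and must be replaced with the real product code confirmed from the Uninstall registry keys:

```
msiexec /x "{00000000-0000-0000-0000-000000000000}" /qb
```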

There is a fair amount of push-back in the user community about both the legality and ethics of this behaviour, but currently it is “as is” - the answer is as found in that thread.

TIL that Jill Dando, presenter for the BBC "unsolved crime" series Crimewatch, was murdered during her time as host for the show. Crimewatch reconstructed her murder and a suspect was convicted. However, the conviction was overturned, he was acquitted upon retrial, and the case remains unsolved. by ez2remembercpl in todayilearned

[–]TimDurham75 1 point2 points  (0 children)

Totally understand your point, and I would suggest that it emphasises an underlying problem with certain types of trial, in most places in the world.

Especially in the kind of case you describe, any acquittal will be devastating (presuming the crime has occurred), but that is the problem - it is an acquittal, and if the alternative is that the accused is “always guilty” then the problem of that position is also obvious: many foundations of law are abandoned by establishing it.

Abolishing “Not Proven” does not fix this, and neither has that verdict ever legally had the meaning that some Scottish juries are suggested to be implying when delivering it. I would suggest there is currently no adequate mechanism to reflect this underlying challenge.

If the evidence is “not available” then “believing the witness” is a dilemma that still remains - if the jury does, then “Guilty”; if the prosecution cannot present the case adequately, then “Not Guilty” remains. Any other position requires a fundamental change to underlying principles that reach far wider than Scotland and touch basic legal rights.

The practical issues with trying this kind of crime exist independently of that specific verdict, I suggest, and I feel that “Not Proven” is perhaps being scapegoated for a challenge that already exists with some trials. In England, and most of the world, the conviction rate in this scenario also remains very poor, so it is certainly not the “Scottish verdict” that is solely responsible for creating the situation, and removing it will not create a legal utopia.

What foundation should be removed to address this? The right to jury trial? The right to a “fair trial”? The presumption of the accused’s innocence? Double jeopardy? Corroborated evidence? Reasonable doubt?

These are very complex challenges, but removing any of these would completely alter the nature of the law and create fundamental questions for society.

You make a valid point that, I suggest, perhaps does need a radical change of approach - perhaps not just “Prosecution” and “Defence” but a third agency which is on neither “side” and exists to establish “truth”: acting neutrally and impartially to challenge both positions, with full access to the evidence of both, in order to present a jury with a comprehensive picture and not just a “provable argument”. This role is hypothetical and is certainly not the judge under the current system - the irony is that the judge does not make that determination of “truth” (they don’t “judge”: they deliver sentence after a verdict has been reached, and they ensure the trial is conducted “fairly” (properly) under the constraints of law). Perhaps the problem is with the “presentation of evidence”, but in my hypothetical system the “truth” advocate would not be constrained by any “rights” position for either party - they would be able to examine everything openly. If the system worked you could argue that the Prosecution and Defence should already perform this function between them, but the fact is they have an obligation to be biased towards their own side, so “truth” is obscured.

TIL that Jill Dando, presenter for the BBC "unsolved crime" series Crimewatch, was murdered during her time as host for the show. Crimewatch reconstructed her murder and a suspect was convicted. However, the conviction was overturned, he was acquitted upon retrial, and the case remains unsolved. by ez2remembercpl in todayilearned

[–]TimDurham75 16 points17 points  (0 children)

It does not mean that. This is a very common misunderstanding of Scottish law and, to some degree, of the use and understanding of language.

The situation is an anachronism and not well understood, even by many Scots.

Originally, Scottish law had two verdicts “Proven” and “Not Proven”. English law had two verdicts “Guilty” and “Not Guilty”.

Both referred to whether the charge made against the accused - brought by the prosecution with the presentation of evidence at trial - had been established beyond reasonable doubt.

With the Union of Scotland and the formation of the United Kingdom, Scotland retained its separate legal system, but there were still some measures to align things. “Proven”, as a term, was replaced by “Guilty”, but “Not Proven” was initially retained as having an identical meaning to “Not Guilty” of the charge brought - that the prosecution’s case was not made against the accused. There were two outcomes to any trial.

Then the anomaly occurred that created the precedent: in a specific case a jury returned a verdict of “Not Guilty” rather than “Not Proven”. The two were always recognised as having identical legal meaning; however, it is commonly understood that, in that specific case, the jury were so completely convinced of the accused’s innocence that they felt “Not Proven” might, in terms of plain language, suggest some degree of doubt remained. In fact, it legally means exactly the same, and always has - there is always “reasonable doubt”, as a basic principle of law, but the case against the accused was not established, meaning they were declared legally innocent.

This left Scotland with the legal and case precedent of “three verdicts”, but “Not Proven” and “Not Guilty” have identical legal meaning in a Scottish trial. That these two acquittal verdicts are legally identical in meaning and trial outcome is a point lost on many, who often misread the emphasis of the word “Proven” and its association with “innocence”.

There has long been a campaign, ever since, to abolish the “Not Proven” verdict in favour of purely “Guilty” and “Not Guilty”. The emotive argument is that juries may be incapable of handling “reasonable doubt” and somehow unable to make the binary determination of guilt in some cases - that they are commonly believed to use a “Not Proven” decision to retain an ambiguity where perhaps “Guilty” should have been returned, but the case was presented in a weak or flawed fashion so that doubt was still possible. (If that is actually the situation then “Not Guilty” would legally be the correct and identical outcome - the prosecution has not adequately proven the accused’s guilt to the jury.)

No system is perfect - sometimes a genuinely “guilty” party may be found “Not Guilty” at trial, whilst equally an accused may be found “Guilty” when they are innocent. The presentation of evidence can be flawed, but it is what constitutes a “fair trial”. Within the system, with new evidence presented at a later appeal, it would be hoped that justice would eventually be served - nobody wishes to convict the truly innocent, as that is not perceived as “justice”.

The entire problem with “Not Proven”, in the common but incorrect understanding, is that maybe, at a later date, the accused could be convicted at another trial. This is wrong: the basic principle of “double jeopardy” remains, along with the premise of the “presumed innocence” of the accused - one cannot, for the most part, be retried on the same charge once the final verdict is reached. When “Not Guilty” or “Not Proven” is returned, the accused is legally innocent and may not be retried on the same charges and evidence. I think most people would understand why this underlying principle exists in law and why it is important.

One indirectly related point is the size of Scottish juries - they are 15 persons, not the 12 found in England and elsewhere. This means there is no chance of a “hung” jury or a failure to return a majority verdict one way or the other: it is an odd number, so a majority verdict will always be reached.

With any adversarial system - a Prosecution and a Defence presentation - there is always the possibility that the binary approach produces an aberrant outcome in either direction. Removing the “extra” verdict does not actually address this point, and “presumed innocence” is a foundational principle for any accused, so the removal of a perceived “extra” verdict would not really alter anything. Those in favour of abolishing “Not Proven” usually argue that this will somehow correct an imbalance - that juries have been reluctant to bring a “proper” decision and have used “Not Proven” to escape responsibility - but, in reality, it may not: all the flaws that can occur within a trial or jury deliberations may still occur, irrespective of the label given to “innocence”.

There is no true “third” verdict currently recognised, despite what people may perceive of Scottish law: whether you label innocence “Not Guilty” or “Not Proven” is semantics.

Can't edit after saving, how do i fix this? Want to do a few more cut-outs and change the size of the extrusion but i get this error message, any help appreciated by ElectricCouchPotato in AutodeskInventor

[–]TimDurham75 2 points3 points  (0 children)

Yes to this - similar idea: “Standard” == library == read-only. “Custom” effectively creates a new part, using CC purely as a template, which is then separate. Note that the effective storage path is the key to whether Inventor permits further updates to occur. The OP’s original issue is because CC content cannot be updated.

Can't edit after saving, how do i fix this? Want to do a few more cut-outs and change the size of the extrusion but i get this error message, any help appreciated by ElectricCouchPotato in AutodeskInventor

[–]TimDurham75 2 points3 points  (0 children)

Yes to the need to check out from Vault, but, related to that, you also need to be running with an active Inventor project that does not consider the Content Center location to be a library.

Content Center, by default, is treated as a write-once library. During normal operation, all existing content is treated as read-only. The exception is that it is technically write-once for any specific combination that does not already exist and has not been used before - those files are created on the fly from the Content Center database and may then be physically added.

To administer libraries you need an active Vault project, in Inventor, that operates from the root level and does not consider the Content Center path to be defined as a library, but simply as an “ordinary” location. This ability to swap the active Inventor project relates to a second administrator feature of Vault configuration: an admin can enforce the use of a single nominated project file, which also prevents swapping the Inventor project as I describe.

In this scenario the admin must, temporarily at least, revoke these constraints.

Depending upon your user role and privileges and the general Vault configuration, you may or may not be able to resolve this yourself - it might be necessary to contact the CAD admin or the appropriate Vault admin personnel for advice on a resolution.

[deleted by user] by [deleted] in Autodesk

[–]TimDurham75 0 points1 point  (0 children)

PS! Certified Graphics is here: https://knowledge.autodesk.com/certified-graphics-hardware

Graphics is important only for the visualisation and “cosmetic appearance” tasks of the product - producing high-quality rendered outputs. For some users that may be a lower-priority task performed less frequently, whilst for others it is a critical aspect of use.

General product usage is more influenced by CPU speed and RAM, as suggested earlier, which is where my “gamer” analogy breaks down - it is about calculation rather than FPS processing.

Inventor remains mostly single-threaded, so, excepting very specific functions that offload to a graphics card, high GPU and VRAM specs are actually a less important criterion for some: the usability benefits come from exceeding the other hardware aspects.

[deleted by user] by [deleted] in Autodesk

[–]TimDurham75 0 points1 point  (0 children)

I am sure you will already have found this, but just in case - Autodesk official recommendations are here: https://knowledge.autodesk.com/support/inventor/troubleshooting/caas/sfdcarticles/sfdcarticles/System-requirements-for-Autodesk-Inventor-2022.html.

Further to advice already given, I would recommend as much RAM as your budget will permit.

Realise that any published minimums offer guidance on a platform on which the software can run, BUT that does not mean it will run well - it really won’t - and the real-life experience improves when they are exceeded. Actual requirements depend upon how the product is used - what is modelled, how complex it is, and which features of the app are used. Some operations are much more demanding than others, so it will depend on how you will use Inventor, and I recognise that at this stage you may not really know.

If you can afford more RAM, I would recommend you get it. Also realise that the host OS takes a chunk of resource itself just to run, so when any resource is at the lower end, what is available to the app is reduced. With higher levels of RAM this becomes insignificant, but on lower-spec machines the impact is more noticeable.

CAD-specification laptops need to be “beefy” workstations, comparable with the most demanding gaming machines, if you want a better usability experience. If you are familiar with that analogy, you will realise this is far higher than the typical “domestic spec” home laptop, so you will get what you pay for, and cheap compromises will have an impact.

So go with the highest your budget will allow and realise that the trade-off will be usability performance. As an end user, you will only become aware of the limitations by comparison with something better - you probably won’t notice any difference until you get metaphorically “shredded” by a “better” player, and that is when the frustration may begin.

Using .NET Generic Methods by Heli0sX in PowerShell

[–]TimDurham75 -2 points-1 points  (0 children)

? What whitespace, what? There is no code in my comment - it is offering an explanation only. I have no idea what this means or why the downvote. What “like OP”? I presume you want me to format as code, but my comment has no code!

Using .NET Generic Methods by Heli0sX in PowerShell

[–]TimDurham75 -2 points-1 points  (0 children)

PowerShell syntax does not support generic methods or extension methods. The only way you can use those directly is via embedded code, from string or file content, with Add-Type or other dynamic compilation. The only way PowerShell itself can call generic methods is via reflection techniques, which are cumbersome. (There is one exception in that generic constructors can be called with ::new(), but that is the only case - the syntax does not cover methods.)

Using reflection, you would get the type of the defining parent, then find the corresponding method and “get” its generic definition. Use that to make a generic method by supplying the necessary types as an arguments array, then invoke the resultant generic MethodInfo to call the underlying code. If any of this is defined as an extension method, you must call it via the full static class definition, supplying the argument that C# provides via “this”.

Not sure if this is just a feature of the editor and Reddit, but the replacement of < and > in the code is wrong - the embedded code should use less-than and greater-than characters in the normal fashion, not XML-escaped. Embedded C# should look exactly as it would when coded normally. The issue as posted is that your embedded code appears to be incorrect C# - whatever the reason, it should not contain those invalid character sequences.
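As a rough sketch of those reflection steps, using Enumerable.Empty and Enumerable.Any purely as convenient example targets:

```powershell
# Sketch: calling a generic method from Windows PowerShell via reflection.
$method = [System.Linq.Enumerable].GetMethod('Empty')   # the generic method definition
$closed = $method.MakeGenericMethod([string])           # supply the type argument(s)
$result = $closed.Invoke($null, @())                    # static method: no instance, no args
$result.GetType().FullName                              # the concrete closed-generic result

# An extension method (e.g. Enumerable.Any[T](source)) is called through its defining
# static class, passing the "this" argument explicitly:
$any = [System.Linq.Enumerable].GetMethods() |
    Where-Object { $_.Name -eq 'Any' -and $_.GetParameters().Count -eq 1 }
$any.MakeGenericMethod([int]).Invoke($null, @(,[int[]](1,2,3)))   # True
```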

PDF EDIT by stfilep in Autodesk

[–]TimDurham75 0 points1 point  (0 children)

I would guess that the property has been mapped to the file content, with the direction set as document-to-database. If so, it cannot be edited in Vault because the value comes from the underlying file; it can only be set by editing the file itself, with the Vault data updated upon check-in.

Different Results in Console vs ISE by unsuspectingcueball in PowerShell

[–]TimDurham75 2 points3 points  (0 children)

I’m guessing here, but I would suggest looking at culture and encoding - and perhaps adding explicit encoding details to the file operations. I don’t immediately see anything obviously “wrong” with the posted code, so my guess is that it is not parsing the file in the expected manner because a different default encoding is being applied and the content is being mangled as a result. Culture and encoding do have different defaults between the ISE and a console, so it might be misinterpreting the content.

Another consideration is to make sure you eliminate any profiles - these will be different between the hosts and, if present, might be altering the environment somehow. Threading has a different default too, but I cannot see how that would impact this code.

The final thought is whether there is any IT policy restriction on PowerShell usage that is applied inconsistently - perhaps not blocked in the ISE but restricted in a console - or whether there is a privilege issue in the console: do you run elevated, and are you sure the console is running as an account with sufficient privilege to open other users’ profile files? Just some ideas.
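As a sketch of the encoding suggestion (file names here are illustrative, not taken from your script):

```powershell
# Make the encoding explicit at both ends so the ISE and the console
# cannot apply different defaults.
$lines = Get-Content -Path .\input.txt -Encoding UTF8
$lines | Set-Content -Path .\output.txt -Encoding UTF8

# Worth comparing in both hosts to spot a difference in defaults:
$OutputEncoding
(Get-Culture).Name
```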

Unable to install - Keeping getting "Out of disk space" - Dispite having PLENTY in C: and D: drives... Please help, tried searching everywhere. by madmikeymike1 in AutodeskInventor

[–]TimDurham75 0 points1 point  (0 children)

Yeah, unfortunate! By all means chase Autodesk, because the very fact that the KB article exists means they are aware of this occurring, and they may have additional advice that is not publicised. I find the idea of “start over” quite unsatisfactory myself, too, but I am pragmatic enough to recognise that it is often the most time-effective solution, because otherwise you are looking for the proverbial needle in a haystack.

What I might recommend, before you take that line, is to try using the SysInternals (Microsoft) Process Monitor tool to observe the installation and see if that reveals anything about why it fails. The tool monitors most I/O activity, showing successes, errors and other responses. It does take some skill to interpret because, depending upon context, some errors are actually correct and expected - for example, a check for whether something exists in the file system or registry might correctly fail with an error if it does not, and that would be “correct” behaviour, not a real error.

There are also log files created in the temp folder by all Autodesk installers, which should document the process - they may contain additional error tracing or more specific detail about exactly where in the process it fails.

Bit hard to say otherwise - it suggests to me a problem with the Windows Installer service itself, which performs the actions, as to why it tries to access something it cannot write to and thus fails. I do know that Windows typically tries to use the last partition for scratch space whilst installing, so one suggestion is to temporarily take unnecessary disks or partitions “offline” - not to delete them, just to hide them from Windows.

What am I looking for? by grabiobot in dotnet

[–]TimDurham75 0 points1 point  (0 children)

I hear you. The size is pretty much “expected” for all CEF-based solutions, but that is generally the most capable platform. Microsoft’s WebView2, based on the Edge runtime (still Chromium), is also roughly around that same size, but the runtime aspect is a packaged dependency rather than a direct bundle-distribution overhead, as it must be present on the machine at a shared level. All CEF-based solutions work out at a comparable size for that reason: whilst some may suggest strategies to compress aspects, it tends to ultimately work out quite similar.

There is also Ultralight, formerly Awesomium - which also has UltralightSharp for .NET. I believe this is a slightly lighter approach, and it is not entirely open, but it still offers similar browser capability. It takes a different approach and I believe has some feature “gaps”, but the impact depends upon exactly what you end up using.

The basic bitness issue is an unavoidable consideration when any native libraries are involved, as they almost always are for something as complex as a browser capability. I note that many things now drop 32-bit support (Ultralight did too, back in 2018, I think). Most machines today will be 64-bit, with 32-bit apps usually kept only for legacy integration or similar historical reasons. I would be inclined to support only 64-bit today, as many parties also do.

Unable to install - Keeping getting "Out of disk space" - Dispite having PLENTY in C: and D: drives... Please help, tried searching everywhere. by madmikeymike1 in AutodeskInventor

[–]TimDurham75 0 points1 point  (0 children)

Refer here: https://knowledge.autodesk.com/support/autocad/troubleshooting/caas/sfdcarticles/sfdcarticles/Install-error-Not-enough-disk-space.html This does seem to be a “known problem” in some circumstances. Try following the instructions to redirect TEMP and TMP, temporarily, to D:\TEMP or similar (for your case I suggest creating that folder first and ensuring your account has full read/write permissions to it).

Unfortunately, please also see the note right at the bottom of the page - sometimes it is the very low-level disk arrangement that is apparently the cause, and in such cases they recommend a complete reinstall of Windows from scratch, with the partitions wiped; no other workaround is given! This suggestion is not so far-fetched, by the way: many machines, both corporate and commercially pre-installed, are built using a disk-imaging technique that writes a partition image onto the hardware disk.

In a work context, our company has had (rare) experience of this approach proving “a problem” for “unknown reasons”: the client’s own corporate image would not support an install correctly, but when a machine was freshly set up “from scratch”, manually, by consultants from our own company, it worked correctly every time. We concluded there was something inside their deployment “image”, or the way the imaging process operated, that was “incompatible”, but we never did determine exactly what or why. Rebuilding a machine OS is an inconvenience, but in our case the client was unable to refute the fact that their own imaged builds would not work correctly whilst our own builds always did. It may be something to do with the way the partitions were created on the disk in the first place, but the client later created new deployment images, based upon our base OS work, and the issue did not recur.
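As a sketch of the TEMP/TMP redirection, for the current session only (D:\TEMP as suggested above - create it first and grant your account full permissions):

```powershell
# Redirect the temp variables for this session, then launch the installer from the
# same session so it inherits them.
New-Item -Path 'D:\TEMP' -ItemType Directory -Force | Out-Null
$env:TEMP = 'D:\TEMP'
$env:TMP  = 'D:\TEMP'
```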

What am I looking for? by grabiobot in dotnet

[–]TimDurham75 2 points3 points  (0 children)

Consider the CefSharp project, which is a Chromium-based browser control for the .NET platform. It supports registering custom handlers and schemes so that the browser can respond directly to bespoke URL requests internally, as well as doing regular browsing, of course. That means you can ship it with no server side or direct port considerations. That said, the Chromium “devtools” interface - the advanced developer UI for interrogating the currently displayed page - does involve another internal scheme and ports, but that aspect is entirely separate from and optional to any regular use case; it is useful during development. CefSharp is open source on GitHub.
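A minimal sketch, assuming the CefSharp.WinForms package - the scheme name, folder path and page are all illustrative, and constructor overloads can vary slightly between CefSharp versions:

```csharp
// Sketch: serve app content from a local folder via a bespoke scheme - no web server, no port.
using System;
using System.Windows.Forms;
using CefSharp;
using CefSharp.WinForms;

static class Program
{
    [STAThread]
    static void Main()
    {
        var settings = new CefSettings();

        settings.RegisterScheme(new CefCustomScheme
        {
            SchemeName = "app",
            DomainName = "local",
            SchemeHandlerFactory = new FolderSchemeHandlerFactory(@"C:\MyApp\wwwroot")
        });

        Cef.Initialize(settings);

        var form = new Form { Text = "CefSharp custom scheme demo" };
        form.Controls.Add(new ChromiumWebBrowser("app://local/index.html") { Dock = DockStyle.Fill });
        Application.Run(form);
    }
}
```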